Filed in Case Studies
Open Data for Latin America
Improve the experience of the open data platform by generating baseline metrics. Introduce the team to user research practice and user-centered processes.

DATAL is a platform for Open Data and open code publication whose documentation is in Spanish. It was created to evolve interaction with governments, improve citizen participation, and open access to valuable information in Latin America. It is powered by Junar, a cloud-based Open Data Platform that has already been helping transform government data into resources citizens can use. The cities of Palo Alto and Sacramento, the government of Chile, and Bahía Blanca in Argentina are some of their clients.


I was the UX lead and helped designers with IxD and visual design. I also planned, recruited for, and conducted remote usability testing in English and Spanish. My deliverables were high-fidelity wireframes, GUI, the visual style of some flows, and usability reports. I worked closely with the developers and supported the team through the redesign implementation.


12 weeks

1 round of usability testing (USA, Argentina + Chile)

Analysis & research

The redesign had already started when I joined the team. The drivers for these changes were the clients themselves and the support leader, who was in contact with current and new customers on a daily basis. I held several sessions with him to elaborate on the hypotheses and immerse him in the UCD process as well. The designers were also full of ideas and had long wanted to introduce changes to the interface. So I performed a heuristic review of the interface, which let me identify user journeys and pain points. At this stage, onboarding was still going to be facilitated by the support team, so we focused on the publishing journey. With the outcome, we made minor tweaks, including CSS and message reviews. The goal was not to invest more effort without user input.

Everyone welcomed the initiative, and with the commercial team I was quickly able to put together a file that helped me learn more about their current users. With this, I contacted and recruited participants while writing the test script.

It's amazing how much teams can be surprised by facts about their current clients they weren't aware of, including the clients' willingness to participate and provide feedback. Each area holds a different interest in this information, but all of them start to see the value of data and of a closer relationship to their users' habits and needs.


We would then identify the usability issues in the publishing flow, understand the users' mental model, and obtain insights for future improvements.

We tested with men and women between 30 and 50 years old: journalists, system analysts, and designers from Argentina, Chile, and the USA.

All tests were recorded with the participant’s agreement for further team review.

With amazing help from development, we had an environment set up specially for testing. (After the usability testing, this environment was kept for front-end development to push changes that everyone could check at any time.) The team was fully remote, so I sent out the testing agenda with the expected time for each participant, explained the protocol, and set expectations for the team: they were merely going to be observers, and no matter what happened, none of them could interfere with the participants' tasks. We used Hangouts for the tests and to share them with the team internally.


After sorting out some connectivity issues, team attendance was overwhelming, a great sign of interest in the process itself.

Testing sessions have an amazing impact on teams. It is no longer about individual opinions; testing opens a door to valuable information for decision making. There's an unequivocal impact in seeing your users struggle with a task you thought should be easy. It's very difficult to ignore.
Findings & solutions

I built and presented a report to the full team, including recommendations and possible next steps. We obtained baseline usability metrics for the effectiveness, efficiency, and satisfaction of the interface for the publishing journey.

We verified issues such as labels that gave non-technical users a hard time, and tips and suggestions with long explanatory texts that were dismissed. We saw form fields that weren't fully understood or weren't working, and were therefore left blank. Feedback was missing in several steps of the process, mostly when it came to loading or saving progress. Previews also generated doubts: participants weren't fully sure they were going to get exactly what they wanted. And users expected to interact with content that had no possible interaction, which exposed room for improvement in affordances as well. Only 30% of the test participants could publish successfully.
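As a rough illustration of how baseline metrics like these can be derived from raw session data, here is a minimal sketch. The session records, field layout, and numbers are hypothetical, not the project's actual results; only the 3-out-of-10 success rate mirrors the 30% reported above.

```python
# Hypothetical session records: (task_completed, time_on_task_seconds, satisfaction_1_to_5).
# Illustrative numbers only; just the 3/10 success rate reflects the study.
sessions = [
    (True, 300, 4), (True, 320, 5), (True, 350, 4),
    (False, 500, 2), (False, 450, 3), (False, 480, 2),
    (False, 520, 3), (False, 400, 2), (False, 390, 3),
    (False, 510, 2),
]

def baseline_metrics(sessions):
    """Return (effectiveness, efficiency, satisfaction) for one task."""
    completed = [s for s in sessions if s[0]]
    # Effectiveness: share of participants who finished the task.
    effectiveness = len(completed) / len(sessions)
    # Efficiency: mean time on task, counting successful sessions only.
    efficiency = sum(s[1] for s in completed) / len(completed)
    # Satisfaction: mean post-task rating across all participants.
    satisfaction = sum(s[2] for s in sessions) / len(sessions)
    return effectiveness, efficiency, satisfaction

print(baseline_metrics(sessions))  # effectiveness is 0.3, i.e. 30% published successfully
```

Recording a baseline like this is what makes a redesign measurable: the same three numbers can be recomputed after each release and compared.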

Acquiring a user-centered language.

A couple of weeks after the test, I continued working on the project. The team had now prioritized a list of features and started to work on them.

By working with features you risk putting effort into a solution that might not be the best answer to the problem. Because of that, I advise teams to adopt user stories, a tool used in Agile development to capture a description of a feature from an end-user's perspective. They generally have this structure: As a <user type>, I want <goal> so that <benefit>. For example: as a journalist, I want to preview a dataset before publishing so that I know readers will see exactly what I intend.

It's important to define the problem to be addressed, but writing alone is not enough: discussions about each of the stories need to happen for them to work.

Prototypes and visual design

Designers were incredibly proactive: we held ideation sessions, showed progress early, and embraced feedback. This is not an easy thing to achieve, and it is "tested" on a daily basis. I pushed them, because even though they already had many solutions, the bias of having worked on something for so long can prevent you from taking higher risks. I worked on the initial proposals and visual style, recovered focus when needed, and promoted a modular design process where we could go from the smallest item of a component to the composition of the template. This was new to them and easily adopted.

I broke down a component into all the possible states.

They used Dropbox for file sharing, so we built a shared Dropbox folder for designs and set some rules so that any of us could open a file at any given time. This takes time, of course, but it worked out fine.

Using InVision's live share tool to provide feedback on designs


All wireframes of the publishing flow, used to verify systemic decisions, page structures, grids, and breakpoints.
Final thoughts

The released version certainly had usability and product improvements (though it didn't yet do everything we intended), but most of all, it showed that research and design can have a measurable benefit for the business.

Adopting a user-centered process requires a cultural change in the group and this experience encouraged that positive change.


This project was presented in September 2015 at ConDatos 2015, the Latin American conference for open data.

Tags: government, Latin America, open data