Improving the usability of the COEIT research website – A UX case study.

TIMELINE: January 2020 – May 2020

Searching a website should be an easy task, but that is not always the case. Some websites present information in a way that drives users to leave and check out other sites. The College of Engineering and Information Technology (COEIT) research website is one such website. This was a semester project, and I teamed up with Hirak Ray, who is currently pursuing his Ph.D. His input over the course of the project was invaluable, as was that of the instructors.

My role: Conduct user research and synthesize the data to improve the usability of the College of Engineering and Information Technology’s research website as part of a semester-long project.

Team Members: Anirudh Nagraj, Hirak Ray


With the growing number of researchers and internet readers alike, how can we improve the usability of UMBC’s COEIT research website to better showcase the research done at the institution?


Double Diamond Method

We used the double diamond method throughout the design process. We collaborated using Google Docs, which supported a divergent thinking process. At the end of each week, we would converge, consolidate our ideas, and work on them iteratively.

We worked independently at the beginning to explore some of the critical questions that emerged from looking at the website at a glance: Who are the users? Which other universities or labs have research websites? What initial usability concerns identified by past users remained unaddressed? What metrics would indicate the site is performing well?

User Analysis

The COEIT research website is a source of information about research at UMBC. It is used by students (ages 18-35) as well as professors (ages 30-70). External researchers may also use the website to learn more about research being conducted at UMBC.

Competitor Analysis

We wanted to know what other research websites offer, both the positives and the negatives. Each of us searched for other universities’ research websites, compiled our findings, and narrowed the list to five websites for reference and analysis.

  1. Carnegie Mellon University (Research Section)
  2. New York University (Research Section)
  3. Massachusetts Institute of Technology (Research Section)
  4. Palo Alto Research Center (PARC)
  5. University of Maryland College Park (Research Section)

Some Usability Concerns – As identified previously by users

We spoke to the website administrator to learn what the known usability concerns were, and they gave us a document detailing the issues. Some of those issues are highlighted below. This gave us an idea of how to proceed once we started the research phase of the project.

  1. Inconsistent content
  2. Limited to no end-user engagement
  3. The site should be designed for an assortment of users
  4. Verbose
  5. No clear pathways on research content.

We wanted to get more out of the website by conducting thorough user research. Both of us were keen on obtaining as much qualitative data as possible. We debated whether to conduct generative research or evaluative research and, in the end, opted for a hybrid approach: we wanted concrete data in hand so that we could easily spot the main improvement points once we began designing.

Contextual Inquiry

Once the competitive analysis was out of the way, we proceeded to our first real step of bringing users into the process: the contextual inquiry. The contextual inquiry gave us qualitative data on users’ thoughts, drawn from pure observation and think-aloud sessions. Participants were chosen so that their fields of study and interests sat at different ends of the spectrum, which gave us a wider range of data. This was the first step in the research for the project.

User Interviews

Continuing with our generative research, we began recruiting participants. Our participants comprised students, faculty, and researchers alike. We followed a semi-structured format: we prepared a protocol and our own questions, and included a request for permission to record and take notes. To ensure confidentiality, we assured participants that their personal data would not be used anywhere. Although we followed the general flow of navigation, we probed the user at specific points to get the most out of each interview. Probing the user is like walking a fine line; the user should never feel pressured at any point during the interview.

This gave us a wealth of qualitative data and let us dive into some of the subtler aspects of improving the website.

Example notes for the interviews.

As the tasks were being executed, we took notes that allowed us to code the data and effectively analyze users’ actions. This analysis let us chart out patterns and identify the users’ major pain points.
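As a simple illustration of this kind of qualitative coding, interview notes can be tagged with short codes and tallied to surface recurring pain points. The notes and code labels below are hypothetical examples, not our actual study data:

```python
from collections import Counter

# Hypothetical interview excerpts, each tagged with a code during analysis.
coded_notes = [
    ("Couldn't find a way back to the research home page", "navigation"),
    ("Too much text on the faculty page", "verbosity"),
    ("Headings don't match the content below them", "headings"),
    ("Got lost clicking through lab pages", "navigation"),
]

# Tally the codes to see which pain points recur most often.
code_counts = Counter(code for _, code in coded_notes)
print(code_counts.most_common())  # "navigation" surfaces as the top pain point
```

Counting coded excerpts this way turns scattered observations into a ranked list of issues worth prioritizing.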


Heuristic evaluations were conducted on the College of Engineering and Information Technology (COEIT) research section to determine the extent of the website’s functionality and its effect on user experience. To achieve this, we strived to identify specific issues with the website, in accordance with Nielsen’s Ten Heuristics. We converged and explored the website and evaluated the interface to find usability problems. These problems were recorded in the form of a written report using Usability Aspect Reports (UARs).

In the UARs, Name refers to the element of the website that has an issue, Evidence refers to the heuristic being violated, and Explanation describes the issue in detail. One evaluator found 9 issues and the other found 11. Of these 20 reports, one issue was identified by both evaluators, bringing the total to 19 unique UARs.
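The overlap accounting works out to 9 + 11 = 20 raw reports, minus 1 duplicate, leaving 19 unique UARs. A minimal sketch of deduplicating the two evaluators' reports by the Name field, using placeholder issue names rather than our real UARs:

```python
# Hypothetical UAR names for each evaluator (9 and 11 issues respectively),
# with one issue found by both.
evaluator_a = {f"issue_a{i}" for i in range(8)} | {"inconsistent headings"}
evaluator_b = {f"issue_b{i}" for i in range(10)} | {"inconsistent headings"}

# Set union keeps the shared report only once.
unique_uars = evaluator_a | evaluator_b
print(len(evaluator_a), len(evaluator_b), len(unique_uars))  # 9 11 19
```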

UAR of the issue highlighted by both members of the team.

Overall, the website appears quite usable and functional, but it is prone to causing annoyance and irritation for users due to multiple usability concerns that have relatively simple solutions. Several issues found by the evaluators also matched common concerns raised by participants in our earlier contextual inquiries, direct observations, and interviews, showing common ground between the experiences of experts and novices alike.

Participatory Design

Notes and boards trying to prompt the user

Brainstorming ideas and coming up with challenges was an arduous task. Still, we identified the navigational loop (navigating from one page to another without knowing what to click to return to the initial page) and other difficulties users faced, such as heading sizes, as challenges worth considering. We noticed that headings did not match the text beneath them, which caused a lot of confusion. This led us to take some time to conduct a participatory design session to get a better idea of what the design should look like.


Skeletal structure laid out on cards

After a 10-minute session of general navigation, we explained the major usability challenges identified in our heuristic evaluation to the participant. We also asked our users to draw what they felt was the most accurate flow, according to them.

We used index cards to draw skeletal site structures and asked the users for their opinions and what changes they would make. This aided the discussion and was a bonus for our participatory design process.


We created two personas to reflect the key characteristics of our users, which aided us in developing an understanding of current users, as well as potential users.

Persona 1

Persona 2

** All images used for the personas were obtained as stock images online.

Low Fidelity Mockups

We continued with low-fidelity prototypes built from a combination of sticky notes and sketches. The design of the prototypes was mainly informed by our participatory design session with all six participants, as well as by the UARs from the heuristic evaluations (which overlapped with findings from the earlier contextual inquiries and interviews).

The sketches may not have fully captured our participants’ suggestions. To verify this, member checking (a technique for assessing the credibility of results, in which data or findings are returned to participants to check for accuracy and resonance with their experiences) could be used. Furthermore, the fidelity of the prototypes would need to improve to surface any further complications.

Medium Fidelity Mockups

Converging again, we set out to make medium-fidelity mockups to bring the content on paper to life. We used to make these sketches. Medium-fidelity prototypes gave us an idea of the placement and behavior of the new designs. We were careful to develop the website incrementally, as we firmly believed that radical changes would have a negative impact on users.

High Fidelity Mockups

We made minor changes to the website, as we had believed in incremental development from the start. We added a home button to round out the navigation, a feedback button that is evident and easy to identify, and a color change that works well with the website’s design scheme.


Sub-Pages


The website currently uses Google Forms. We made the form more intuitive and in sync with the color scheme by creating a personalized form, which includes a back button so the user can return to the main page.


Considering all the data we collected and synthesized, we came together in Adobe XD to create a prototype with incremental improvements over the current version of the website. These minor increments played a major part in enhancing the current website’s usability.