
users’ ability to understand, interact with, and derive insights from the presented data. Therefore, dashboards should be clear, simple, and present information concisely (Bach et al., 2022). Usability is considered an indicator of a system’s efficiency and effectiveness, assessing its ease of use and the user’s ability to successfully complete tasks (Best and Smyth, 2011). User experience, on the other hand, encompasses all aspects related to people’s interaction with a product, including the user’s perception of how they feel when using it, their understanding of how the system works, and the product’s suitability for the context in which it is used (Maia et al., 2020).
Although there are related works on the development of dashboards and User Experience (UX), the number of methods specifically focused on evaluating user experience (Almasi et al., 2023) and usability is still low (Silva et al., 2018). Both usability and UX are considered key factors for the success of a system (Hassan and Galal-Edeen, 2017). However, there is still a noticeable lack of technologies that support dashboard design by integrating usability and UX concepts and that have been applied for validation and evaluation.
This paper aims to contribute to the process of evaluating and redesigning dashboards, proposing a more appropriate interface that can help improve the quality of these tools from the perspective of end users. The idea is to offer a set of artifacts that contribute to the development of dashboard interfaces. Initially, we propose the creation of a checklist to support and guide developers in evaluating dashboards, ensuring that usability and user experience guidelines are being met. Then, we suggest applying design patterns to help meet the quality attributes regarding usability and UX by providing solutions and examples. In doing so, we intend to reduce rework and avoid potential ambiguities in identifying system errors and their solutions.
The remainder of the paper is organized as follows. Section 2 presents a review of related work in this area. Section 3 identifies attributes from a literature review and proposes a checklist for evaluating dashboards. Section 4 presents the project context in which the checklist was applied, while also presenting suggestions for correcting the identified problems. Section 5 concludes the paper and outlines future perspectives for this research.
2 RELATED WORK
There is a growing concern with designing high-quality systems that present information effectively and objectively to users (Praharaj et al., 2022). In this context, several studies have highlighted the development of information systems and dashboards aimed at supporting decision-making processes, while focusing on their ease of use (Almasi et al., 2023; Maceli and Yu, 2020; Smuts et al., 2015). Below, we present some of these studies and their contributions to the field.
In their research, Almasi et al. (2023) reviewed existing questionnaires used to assess the usability of dashboards and suggested some criteria for this evaluation. The criteria include utility, operability, ease of learning, customized questionnaire, improved situational awareness, user satisfaction, user interface, content, and system capabilities. The authors emphasize that, when selecting criteria to evaluate the usability of dashboards, it is essential to consider the study’s objectives, the characteristics and capabilities of the panels, and the context of use.
The research by Enache (2021) involved the design and analysis of a dashboard for runners, with the goal of preventing injuries. The interactive running dashboard was evaluated with users through the System Usability Scale (SUS), whose score informed the development of the dashboard.
In another study, Maceli and Yu (2020) explored the design and evaluation of an open-source data dashboard interface for archivists through usability testing and heuristic evaluation. A heuristic evaluation of the environmental monitoring dashboard was conducted, assessing the interface against Nielsen’s ten usability heuristics.
Similarly, Smuts et al. (2015) investigated the usability of Business Intelligence (BI) tools that support the development of dashboards, with a specific focus on novice users. The main research problem addressed in their study is the complexity of the dashboard development process in traditional BI tools.
Sarikaya et al. (2018) aimed to discover and identify different types of dashboard designs. To achieve this, they conducted a multidisciplinary literature review to understand practices surrounding the use of dashboards. The review allowed them to build a characterization of the uses and domains of dashboards and to identify issues that the literature considers urgent. Later, Bach et al. (2022) described the process of creating design patterns for dashboards, which involved a systematic literature review on dashboards and data visualization. A total of 144 dashboards were collected, building on an initial selection of 83 dashboards gathered by Sarikaya et al. (2018). The authors provided insights into the design of dashboards, offering applicable design knowledge