Question: summarize the article
Introduction
The popularity of mobile devices and applications has continued
to explode, and the number of US adults who own smartphones has
increased from 35% in May of 2011 to 56% in May of 2013 [1].
Outdoor recreation participation has been increasing steadily since
The Outdoor Foundation began documenting participation in 2006,
with 141.9 million participants in the United States getting
outdoors in 2012 and "an average of 87.4 outings per participant
for a total of 12.4 billion outings overall" [2].
The most popular activities included
running, bicycling, fishing, camping, and hiking.
With the ever-climbing popularity of mobile devices and
applications and the widespread and steady interest in outdoor
recreation comes the need to provide recreation and tourism
information via mobile devices. However, currently there is a lack
of mobile guidelines or standards based on research, especially for
applications or websites that need to provide a large amount of
information to a broad audience.
The Corps Lakes Gateway (CLG) website
provides an enormous amount of information about US Army Corps of
Engineers outdoor recreation resources for visitors. The CLG
website (Fig. 1) received over 47 million hits in 2011, showing the
desirability of this information. The Corps’ 422 lakes in 43 states
receive 370 million visits a year. To enhance the experiences of
visitors and increase public interaction with the Corps, our aim was
to create a mobile application to meet the growing expectations and
needs of the public for interactivity at their fingertips.
[F.F.-H. Nah (Ed.): HCIB/HCII 2014, LNCS 8527, pp. 371–378, 2014.
© Springer International Publishing Switzerland 2014]
To learn what types of features and information users expected
for this mobile recreation application, we first conducted the
following studies, with each subsequent study using and building on
the findings of the previous:
The results of these studies were then
used to create a CLG mobile application prototype, which was
usability tested at two US Army Corps of Engineers (USACE)
projects: Green River Lake and Nolin River Lake, Kentucky, in June
of 2013. The goal of this study was to identify user interface
issues with the CLG mobile prototype in order to improve the user
experience and generate research-based mobile interaction design
best practices.
Related Work
Given the growth of mobile device and
application use, there is a critical need for a set of research-based
standards specifically for mobile interaction design. Guidelines
have been provided for iOS and Android for developers
Guide- lines have been provided for iOS and Android for developers
using these platforms to employ consistent design, yet these often
"do not make recommendations for mobile websites, which run on
different kinds of operating systems" [3]. Other guidelines have
begun to emerge with practitioners and researchers working
to bridge the gap between established web
standards and standards for mobile. For example, Shneiderman's
"Golden Rules of Interface Design" have been applied to mobile
usability [4], as have Nielsen's five attributes of usability [5].
Additionally, Wroblewski’s mobile design strategies are based on
real-world projects [6]. However, these efforts demonstrate the
need for mobile's own set of standards for usability.
Research on the usability of mobile
devices, websites, and applications has been and continues to be
conducted, but the majority of this testing has involved controlled
experiments [5], most often in a lab setting. This type of testing can be
problematic, as it can limit the research by not capturing
contextual cues and issues that could be found in a field study
[7]. Therefore, in our study we moved outside of the lab and
conducted usability testing at two outdoor recreation locations
(campgrounds and boat launches) with a variety of recreation users
to increase the ecological validity of our findings and preserve
the context in which this mobile application would be used.
Nielsen and Budiu have recently offered
strategies and guidelines for mobile usability, such as the need
to focus the attention of users only on the essential content [9].
However, the amount of information that needs to be represented on
a recreation or tourism mobile application leads to the issue of
what is essential content for these types of users. Research for
information-intensive mobile websites and applications is needed,
and therefore to increase the validity and accuracy of the results
of our studies for this issue, we used multiple methods throughout
our design process [10].
Evaluation Scope
The goal of this research was to
identify usability issues with the CLG mobile prototype and
develop best practices for designing mobile user interfaces based
on these findings. Usability refers to how easily a specific task
can be accomplished with a specific tool. More formally, the
International Organization for Standardization (ISO) defined
usability as the "extent to which a product can be used by
specified users to achieve specified goals with effectiveness,
efficiency and satisfaction in a specified context of use" [11].
Effectiveness is defined as "accuracy and completeness with which
users achieve specified goals," efficiency is defined as "resources
expended in relation to the accuracy and completeness with which
users achieve goals," and satisfaction as "freedom from
discomfort, and positive attitudes towards the use of the product."
While other conceptualizations of usability have been proposed
[12-14], the ISO definition is the most widely accepted and was
used by the research team.
Evaluation Metrics
Usability was evaluated in terms of its
three constituent components: effectiveness, efficiency, and
satisfaction. Effectiveness was measured as the percentage of tasks
completed successfully. Efficiency was measured as the average time
to perform a task and assessed based on issues observed during
performance of the tasks. Satisfaction was measured by post-task
questionnaires, a post-study questionnaire, written feedback, and
verbal comments during the session. While effectiveness and
efficiency measures were quantitative, satisfaction was measured
qualitatively.
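As a hypothetical illustration of the two quantitative metrics defined above, effectiveness (percentage of tasks completed successfully) and efficiency (mean time per task) could be computed from a per-task session log like the sketch below. The task names and timing values are invented for illustration; they are not the study's actual data.

```python
from statistics import mean

# Hypothetical session log: (task name, completed successfully?, seconds taken).
# These values are illustrative only, not results from the CLG study.
results = [
    ("find a campground", True, 45.0),
    ("reserve a campsite", True, 80.0),
    ("check lake conditions", False, 120.0),
    ("submit post-trip feedback", True, 60.0),
]

# Effectiveness: share of tasks completed successfully, as a percentage.
effectiveness = 100.0 * sum(completed for _, completed, _ in results) / len(results)

# Efficiency: mean time to perform a task, in seconds.
efficiency = mean(seconds for _, _, seconds in results)

print(f"Effectiveness: {effectiveness:.1f}% of tasks completed")
print(f"Efficiency: mean task time {efficiency:.2f} s")
```

In a real analysis, efficiency would also be weighed against the observed difficulties noted during task performance, as described above.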
Evaluation Strategy
Testing was designed to answer the
following questions:
Tasks designed to address these
questions were developed by the project team, which consisted of
researchers with experience conducting visitor studies in park and
outdoor recreation settings and performing user-focused
evaluations. The tasks pertained to three phases of a typical
recreation trip: pre-trip planning, on-site activities, and
post-trip reflection and feedback. The mobile application tested
was not "live," since these rural locations have little or no
internet connectivity. However, its basic functionality and several
levels of screens tied to actual project data were made available
to Corps recreation visitors for evaluation.
Participants consisted of a convenience
sample of visitors who were recruited at two Corps of Engineers
lakes in Kentucky: Green River Lake and Nolin River Lake, with the
permission of the park managers. To maximize diversity in the
sample, the Corps managers advised the research team of two
campgrounds (one at each lake) where the team was likely to
encounter a fairly large number of visitors who would be
participating in a variety of outdoor recreation activities.
Procedure
The research team occupied a campground
site and set up the data gathering station consisting of a table,
tablet computer with mobile app mock-up and recording software,
video camera for recording tablet usage, and umbrella to darken the
screen for ease of viewing in a bright, outdoor setting. Visitors
were recruited by walking around the campground and asking for
volunteers, who would receive $25 compensation for the 30-minute
sessions. After reviewing and signing the consent forms and filling
out a demographic questionnaire, participants performed up to nine
tasks and filled out a post-study survey on-site about their
experience with the mobile application prototype. Both audio and
video recordings of the interviews were made using TechSmith’s
Morae® (v3.2.1) software.
In the Usability/Accessibility Research
and Consulting research lab at Michigan State University, two
independent reviewers, working with written observation notes,
reviewed the Morae recordings to transcribe relevant user quotes,
compute task completion times, and record difficulties and
successes in completing the tasks. This analysis was verified by
the principal investigators and discussions with the independent
reviewers took place to resolve any differences in interpretation
of the data.
Participants
Twelve participants, six females and
six males, took part in the usability testing of the prototype at
Green River and Nolin River Lakes (with 6 users participating at
each site). Participants’ ages ranged from 19 to 72; five
participants were between 18 and 30 years old, two were between 31
and 40 years old, and four were between 51 and 80.