ACM SIGCHI General Interest Announcements (Mailing List)


"ACM SIGCHI General Interest Announcements (Mailing List)" <[log in to unmask]>
Sun, 26 Jan 2003 06:57:34 -0000
Julian Newman <[log in to unmask]>
IEEE WETICE is coming to Europe for the first time this summer.  Be part
of it!

IEEE 12th International Workshops on Enabling Technologies:
Infrastructure for Collaborative Enterprises (WETICE 2003)

Call for Participation: Workshop on Evaluation of Collaborative
Information Systems and Support for Virtual Enterprises

June 9-11, University of Linz, Austria


Workshop Co-Chairs: 

Julian Newman, Glasgow Caledonian University, UK
Elaine Raybourn, Sandia Labs, USA

Associate Chair:

Josie (Pi-Hsuan) Huang, Glasgow Caledonian University, UK


Effective collaboration involves people, communication, and the
co-creation of meaning through information sharing that may be
synchronous and/or asynchronous. Researchers and practitioners need
tools to measure incremental progress towards developing useful
collaborative groupware systems, as well as methods to evaluate the
impact of specific technologies on the effectiveness of human-to-human
or human-to-machine collaboration. We believe that developing effective
evaluation methodologies will facilitate progress in designing and
deploying collaborative technologies.

The primary goal of this workshop is to provide a forum in which
researchers and practitioners can share tools and methodologies for
evaluating collaborative enterprises, lessons learned from deploying
collaborative technologies in organizations and educational
institutions, and ideas for the directions evaluation research must
take in order to advance distributed virtual collaboration. Ultimately,
there needs to be a framework or taxonomy that can answer:

Which approaches are best for evaluating different types of
collaborative systems? Are different approaches useful for different
phases of analysis?
In the design cycle of collaborative software development, when are
particular evaluation approaches effective and when are they not? Can a
spectrum be developed?
What combination of methods and techniques for gathering metrics is
most effective for the situation under evaluation?
Which metrics address product and process effectiveness, efficiency,
and satisfaction of the collaborative enterprise under study?
What evaluation tools and mechanisms are best for generating specific
metrics?

Topics that contribute to this framework may include:

Benchmark collaboration scenarios and associated evaluation measures for
groupware system design and development, 
Adaptation of single-user software development and evaluation techniques
to groupware evaluation, 
Groupware design principles or heuristics for use in groupware
evaluation,
Analysis of group characteristics (organizational, behavioral, and
technical) and corresponding groupware characteristics,
Collaboration evaluation methods and tools that use design ethnography, 
Case studies evaluating collaborative enterprises, 
Methods and tools for lowering the cost of evaluating collaborative
enterprises, and 
Methods and tools for effective field study evaluation. 
Additionally, we invite participation from the following perspectives:

Intelligent community-based systems 
Context-aware groupware systems 
Intelligent user interfaces 
Mobile telephony & handheld devices 
Knowledge Management 
Interactive Storytelling Applications 
Software Engineering issues 
Requirements Management 
Assessment of tool integration and interworking for virtual
enterprises, and
Evaluation of IT benefits in the context of virtual organisations.

This workshop is an excellent opportunity to bring together people who
are addressing the unique and challenging needs of collaborative
enterprise evaluation. We welcome delegates from all aspects of industry
and academia. Previous WET ICE workshops have attracted delegates from
areas such as computer science, information science, artificial
intelligence, communication, psychology, sociology, education, human
factors, usability, systems engineering, and library sciences.  

WET ICE is well established as a working conference for the development
of theory and practice.  The evaluation workshops aim at establishing
and disseminating rigorous approaches to collaborative systems
evaluation. We encourage authors to consult the workshop reports and
papers from the previous WET ICE evaluation workshops held in 2000, 2001
and 2002, in order to contribute to developing dialogue, whether by
developing themes from the earlier workshops or by identifying problems,
issues and solutions that have so far been overlooked.


Important Dates 

Full papers due to workshop - March 07, 2003

Notification to authors - April 11, 2003

Final papers for post-proceedings to IEEE - May 16, 2003 (PDF file,
IEEE format)

Advance registration deadline - May 16, 2003

Final workshop reports by Workshop Chairs - June 27, 2003 (after the
workshop)

Workshop Dates - June 09-11, 2003

Submission: For Submission and Format requirements, see

Enquiries: Please send all enquiries regarding this workshop to the
Workshop Co-chairs or Associate Chair [log in to unmask]  

For enquiries regarding WETICE in general, contact [log in to unmask]
or call (U.S.) +1-304-293-7226.

Main WETICE 2003 website: