Data-intensive workflows (also known as scientific workflows) are routinely used in most scientific disciplines today, especially in the context of parallel and distributed computing. Workflows provide a systematic way of describing an analysis and rely on workflow management systems to execute complex analyses on a variety of distributed resources. They sit at the interface between end-users and computing infrastructures. With the drastic increase in raw data volume in every domain, they play an even more critical role in helping scientists organize and process their data and in leveraging HPC and HTC resources.
This workshop focuses on the many facets of data-intensive workflow management systems, ranging from job execution to service management and the coordination of data, service, and job dependencies. The workshop therefore covers a broad range of issues in the scientific workflow lifecycle, including: data-intensive workflow representation and enactment; the design of workflow composition interfaces; workflow mapping techniques that optimize workflow execution; workflow enactment engines that must deal with failures in the application and execution environment; and a number of computer science problems related to scientific workflows, such as semantic technologies, compiler methods, and fault detection and tolerance.
The workshop builds on a series of earlier meetings:
The NSF-funded Workshop on Challenges of Scientific Workflows in May 2006
WORKS06 in June 2006
WORKS07 in June 2007
The 2007 IEEE Workshop on Scientific Workflows in July 2007
3rd International Workshop on Scientific Workflows and Business Workflow Standards in e-Science (SWBES) in December 2008
WORKS08 in November 2008 in conjunction with SC'08 in Austin, TX
WORKS09 in November 2009
WORKS10 in November 2010
WORKS11 in November 2011
WORKS12 in November 2012
WORKS13 in November 2013
WORKS14 in November 2014
WORKS15 in November 2015