ICIS Research Publications



Unclassified

G. Antoniou, and F. van Harmelen. A Semantic Web Primer. The MIT Press, Cambridge, Massachusetts, USA, 2004, ISBN 9780262012102.

[ Missing PDF ] [ Bibtex ]

D. Brickley, and R. Guha. RDF Vocabulary Description Language 1.0: RDF Schema, W3C Recommendation 10 February 2004. Technical report, February, W3C, 2004, Last checked: 04-Oct-2005.

[ External URL ] [ Bibtex ]

R.O. Briggs. On Theory-Driven Design and Deployment of Collaboration Systems. Lecture Notes in Computer Science 3198, Springer-Verlag, 2004, ISBN 3-540-23016-5.

[ Missing PDF ] [ Bibtex ]

J. Cardoso, A. Sheth, J. Miller, J. Arnold, and K. Kochut. Quality of service for workflows and web service processes. In: Web Semantics: Science, Services and Agents on the World Wide Web, Nr: 3, Vol: 1, Pages: 281-308, April, 2004.

[ PDF ] [ Bibtex ]

J.L.G. Dietz, and T.A. Halpin. Using DEMO and ORM in Concert: A Case Study. In: Advanced Topics in Database Research, Vol: 3, Pages: 218-236, 2004.

The Demo Engineering Methodology for Organizations (DEMO) enables business processes of organizations to be modeled at a conceptual level, independent of how the processes are implemented. DEMO focuses on the communication acts that take place between human actors in the organization. The Object-Role Modeling (ORM) approach enables business information to be modeled conceptually, in terms of fact types as well as the business rules that constrain how the fact types may be populated for any given state of the information system and how derived facts may be inferred from other facts. ORM also includes procedures to map conceptual data models to physical database schemas. Both DEMO and ORM treat fact types as fundamental, and require that their models be expressible in natural language sentences. This suggests that the approaches may be synthesized in a natural way, resulting in a more powerful method for business modeling. This paper discusses an exploratory case study in which both methods were used in concert, and identifies some lessons learned.

[ PDF ] [ Bibtex ]

P. Donzelli, and B. Bresciani. Improving requirements engineering by quality modelling - a quality-based requirements engineering framework. In: Journal of Research and Practice in Information Technology, Nr: 4, Vol: 36, November, Australian Computer Society (ACS), 2004.

[ Missing PDF ] [ Bibtex ]

Cynthia Dwork, Moni Naor, and Amit Sahai. Concurrent zero-knowledge. In: J. ACM, Nr: 6, Vol: 51, Pages: 851-898, ACM, New York, New York, USA, 2004, ISSN 00045411.

[ PDF ] [ Bibtex ]

Mehmet Emin Dönderler, Özgür Ulusoy, and Ugur Güdükbay. Rule-based spatiotemporal query processing for video databases. In: The VLDB Journal, Nr: 1, Vol: 13, Pages: 86-103, Springer, 2004, ISSN 10668888.

[ PDF ] [ Bibtex ]

Michael Gertz, M. Tamer Özsu, Gunter Saake, and Kai-Uwe Sattler. Report on the Dagstuhl Seminar: "data quality on the Web". In: SIGMOD Rec., Nr: 1, Vol: 33, Pages: 127-132, ACM, New York, New York, USA, 2004, ISSN 01635808.

[ PDF ] [ Bibtex ]

R.L. Glass, V. Ramesh, and I. Vessey. An Analysis of Research in Computing Disciplines. In: Communications of the ACM, Nr: 6, Vol: 47, Pages: 89-94, June, 2004.

Many people consider the contemporary time period the "era of computing." Indeed, technological advances in the computing field are leading the world in many exciting new directions. Research is, of course, a primary mechanism by which the computing field initiates its advances. It is our intention here to analyze the field of computing by examining computing research, in order to better understand where the field has been, and to consider where it may be going. To accomplish this, we break the computing field down into its most common academic subdivisions: computer science (CS), software engineering (SE), and information systems (IS). With those three disciplines in mind, we examine the following questions: How different are the topics upon which they do research? How similar are research approaches and research methods? Upon which reference disciplines do they draw? At what level of analysis is research typically conducted?

[ PDF ] [ Bibtex ]

J. de Graaff. A Repeatable Process for Collaborative Knowledge Gathering. Delft University of Technology, Delft, The Netherlands, EU, 2004.

[ Missing PDF ] [ Bibtex ]

R. Harder, and H. Higley. Application of Thinklets to Team Cognitive Task Analysis. IEEE Computer Society Press, Washington DC, USA, 2004.

As was seen in the 2003 Gulf War, the U.S. Army is migrating to a lighter, more mobile and more lethal fighting force. In support of this desired paradigm, the U.S. Army's major battle laboratories are performing key experiments to determine how the projected mix of personnel, materiel, and doctrine can be interwoven into a desired structure for the year 2015. In February 2003, the U.S. Army's Battle Command Battle Laboratory (BCBL-L) at Fort Leavenworth, Kansas, appraised, utilizing a more intuitive decision-making process, a simulated futuristic two thousand-man force structure with its anticipated equipment. There were several stated, and a few unstated, objectives to this experiment, necessitating the use of numerous observers and several data collection instruments. Of the data collection instruments, the principal instrument for Team Cognitive Task Analysis functions was the Wagon Wheel interview methodology. Of the tools, the principal tool employed to collaborate, consolidate and sustain the data collection events was a Group Support System. This paper will first explore how the selected Group Support System tools were utilized to automate the Wagon Wheel process from a one-on-one manual process to an automated system that enabled simultaneous data collection from 20 individuals. This paper will then examine how the concept of ThinkLets was used to define the Wagon Wheel process. Lastly, an exchange of ideas will be provided, discussing the strong and weak points of using a Group Support System in the experiment, the problems that arose and the solutions employed, and some thoughts for using a Group Support System in the follow-on experiments.

[ PDF ] [ Bibtex ]

R. Kishore, H. Zhang, and R. Ramesh. A Helix-Spindle Model for Ontological Engineering. In: Communications of the ACM, Nr: 2, Vol: 47, Pages: 69-75, 2004.

Recently, there has been increasing recognition that ontological principles can be effectively applied to the broader field of information systems. This has led to the notion of 'ontology-driven information systems' covering both the structural (the content including the database, user interface, and application program components) and the temporal (the development and operation processes) dimensions of IS. In the structural dimension, ontological abstractions of IS artifacts will lead to improved communication through common vocabularies, enhanced software reuse, and systems interoperability. In the temporal dimension, effective and efficient conceptual analysis and design processes can be obtained through new ontology-driven IS development approaches.

[ PDF ] [ Bibtex ]

G. Klyne, and J.J. Carroll. Resource Description Framework (RDF): Concepts and Abstract Syntax, W3C Recommendation 10 February 2004. Technical report, February, W3C, 2004, Last checked: 15-Sept-2005.

[ External URL ] [ Bibtex ]

G.L. Kolfschoten, R.O. Briggs, J.H. Appelman, and G.J. de Vreede. Recurring Patterns of Facilitation Interventions in GSS Sessions. In: Proceedings of the 37th HICSS, IEEE Computer Society Press, Washington, DC, USA, 2004.

[ Missing PDF ] [ Bibtex ]

G.L. Kolfschoten, R.O. Briggs, J.H. Appelman, and G.J. de Vreede. ThinkLets as Building Blocks for Collaboration Processes: A Further Conceptualization. Springer-Verlag, 2004.

In the past decade, there has been a steady increase in the importance of collaboration to value creation in organizations, which has given rise to a new research field. Collaboration Engineering aims to model, design, and deploy repeatable collaboration processes for high-value recurring collaborative tasks, to be executed by practitioners themselves. Thus the aim of collaboration engineering is to create ready-made designs for group processes. A key concept in Collaboration Engineering is a thinkLet - a codified facilitation intervention in a group process to create a desired pattern of collaboration. This paper presents an analysis of the thinkLet concept and possible thinkLet classification schemes to support collaboration engineers in effectively designing collaboration processes.

[ PDF ] [ Bibtex ]

M.A. Koops. Reliability and the value of information. In: Animal Behaviour, Nr: 1, Vol: 67, Pages: 103-111, 2004.

When animals lack the ability to discriminate between correct and incorrect information, they must choose to either respond to or ignore an information source. I define the reliability of information as the probability that information is correct and introduce reliability into the value of information to explore the level of risk information consumers should incur when using information. I find that when information consumers do not control the reliability of information, they should be more discriminating as the benefit of correct information declines, as the cost of misinformation increases and when acquisition costs are greater. However, when consumers pay for reliability, I find that consumers should be more discriminating as the benefit of correct information increases, the cost of misinformation increases, and when acquisition is cheaper. Application of this theory suggests that: (1) selection for individual recognition should be stronger when the cost of misinformation (deception) is high; (2) mimics can outnumber models when models are very noxious or alternative prey are abundant; and (3) memory about predators should be longer than memory about prey.

[ PDF ] [ Bibtex ]

M.E. Kraft, and S.R. Furlong. Public Policy: Politics, Analysis, and Alternatives. CQ Press, Washington, D.C., USA, 2004.

[ Missing PDF ] [ Bibtex ]

R. Lämmel. Transformations Everywhere. In: Science of Computer Programming, Nr: 1-3, Vol: 52, Pages: 1-8, August, 2004.

[ Missing PDF ] [ Bibtex ]

Steve McConnell. Code Complete: A Practical Handbook of Software Construction. 2nd edition, Microsoft Press, Redmond, Washington, USA, 2004.

[ Missing PDF ] [ Bibtex ]

TOGAF - The Open Group Architectural Framework. The Open Group, 2004.

TOGAF is a framework - a detailed method and a set of supporting tools - for developing an enterprise architecture. It is described in a set of documentation published by The Open Group on its public web server, and may be used freely by any organization wishing to develop an enterprise architecture for use within that organization.

TOGAF was developed by The Open Group's own members, working within the Architecture Forum. The original development of TOGAF Version 1 in 1995 was based on the Technical Architecture Framework for Information Management (TAFIM), developed by the US Department of Defense. The DoD gave The Open Group explicit permission and encouragement to create TOGAF by building on the TAFIM, which itself was the result of many years of development effort and many millions of dollars of U.S. government investment.

[ External URL ] [ Bibtex ]

Donn B. Parker. The folk art of information security needs an upgrade. In: Communications of the ACM, Nr: 8, Vol: 47, Pages: 11-12, August, 2004.

[ Missing PDF ] [ Bibtex ]

R. Schwitter. Controlled Natural Languages. Centre for Language Technology, Macquarie University, 2004.

[ External URL ] [ Bibtex ]

C.H. Scott, and J.E. Scott. On models for the operation of a class of electronic marketplaces. In: Omega, Nr: 5, Vol: 32, Pages: 373-383, October, 2004.

Innovation in information technology and the use of the Internet have enabled electronic marketplaces to become an important mechanism for linking suppliers and customers in a cost-efficient fashion. Previous research has generally focused on the benefits of electronic marketplaces from transaction cost economics and strategic perspectives. Yet very little academic research has addressed how to actually operate such a marketplace. In this paper, after reviewing the status of electronic marketplaces research, we focus on an operational perspective. A model of an electronic marketplace linking customers and suppliers either directly or via an intermediary is given and solved under various scenarios. This model uses a single cost-minimizing objective and the extensions address issues such as (1) a physical presence for the electronic marketplace, which can provide value-added services and preserve anonymity; and (2) sole sourcing or dual sourcing. An additional model explicitly represents the diverse objectives of the multiple players in the market using goal programming. The contribution of this research to practitioners is to offer a cost-effective alternative to current forms of allocating supply and demand. The cost-minimizing and multiple criteria models and extensions in this study make a contribution to research by expanding the horizons of previous studies on the operation of electronic marketplaces.

[ PDF ] [ Bibtex ]

M.W.A. Steen, H.W.L. ter Doest, M.M. Lankhorst, and D.H. Akehurst. Supporting Viewpoint-Oriented Enterprise Architecture. In: Proceedings of the 8th IEEE International Enterprise Distributed Object Computing Conference (EDOC 2004), Pages: 20-24, September, 2004.

[ Missing PDF ] [ Bibtex ]

Vitruvius. October, Wikipedia, 2004, Last checked: 26-Oct-2004.

[ External URL ] [ Bibtex ]

G.J. de Vreede. Collaboration Engineering: Designing for Self-Directed Team Efforts. In: Proceedings of the Shaping the Future of IT Conference, Tucson, AZ, November 3-5, 2004.

[ Missing PDF ] [ Bibtex ]

H. Weigand, and A. de Moor. Argumentation semantics of communicative action. In: Proceedings of the 9th International Working Conference on the Language-Action Perspective in Communication Modelling (LAP 2004), Edited by: M. Aakhus, and M. Lind. June, Rutgers University, New Brunswick, USA, 2004.

[ Missing PDF ] [ Bibtex ]

D.L. McGuinness, and F. van Harmelen. OWL Web Ontology Language: Overview. Technical report, February, W3C, 2004, Last checked: 15-Sept-2005.

[ External URL ] [ Bibtex ]

Journal

B. van Gils, H.A. (Erik) Proper, and P. van Bommel. A conceptual model for Information Supply. In: Data & Knowledge Engineering, Vol: 51, Pages: 189-222, 2004.

This paper introduces a conceptual model for information supply. The model is based on the notion that similar information can be conveyed by multiple representations. Hence, we define that information services provide access to several representations. The relation between these two concepts is provided by what we call features. Furthermore, we recognize the fact that information services are interrelated by means of relations. We informally introduce this model using the ER-notation, and provide a formalization as well. Last but not least, the notion of transformations is introduced. Using transformations we are able to transform representations to a type that is both acceptable and useful for the user of an IR-system.

[ PDF ] [ Bibtex ]

Santi Caballé, P. Dague, F. Lévy, J. Montmain, M. Staroswiecki, and L. Travé-Massuyès. Conflicts versus analytical redundancy relations: a comparative analysis of the. In: IEEE Transactions on Systems, Man and Cybernetics, Nr: 5, Vol: 34, Pages: 2163-2177, October, 2004.

Two distinct and parallel research communities have been working along the lines of the model-based diagnosis approach: the fault detection and isolation (FDI) community and the diagnostic (DX) community that have evolved in the fields of automatic control and artificial intelligence, respectively. This paper clarifies and links the concepts and assumptions that underlie the FDI analytical redundancy approach and the DX consistency-based logical approach. A formal framework is proposed in order to compare the two approaches and the theoretical proof of their equivalence together with the necessary and sufficient conditions is provided.

[ PDF ] [ Bibtex ]

T. Heskes. On the uniqueness of loopy belief propagation fixed points. In: Neural Computation, Vol: 16, Pages: 2379-2413, 2004.

[ PDF ] [ Bibtex ]

H. Jonkers, M.M. Lankhorst, R. van Buuren, S.J.B.A. (Stijn) Hoppenbrouwers, M. Bonsangue, and L. van der Torre. Concepts for Modeling Enterprise Architectures. In: International Journal of Cooperative Information Systems, Nr: 3, Vol: 13, Pages: 257-288, 2004.

A coherent description of an enterprise architecture provides insight, enables communication among stakeholders and guides complicated change processes. Unfortunately, so far no enterprise architecture description language exists that fully enables integrated enterprise modelling, because for each architectural domain, architects use their own modelling techniques and concepts, tool support, visualisation techniques, etc. In this paper we outline such an integrated language and we identify and study concepts that relate architectural domains. In our language concepts for describing the relationships between architecture descriptions at the business, application, and technology levels play a central role, related to the ubiquitous problem of business-IT alignment, whereas for each architectural domain we conform to existing languages or standards such as UML. In particular, usage of services offered by one layer to another plays an important role in relating the behaviour aspects of the layers. The structural aspects of the layers are linked through the interface concept, and the information aspects through realisation relations.

[ PDF ] [ Bibtex ]

P. Kardassis, and P. Loucopoulos. Expressing and Organising Business Rules. In: Information and Software Technology, Vol: 46, Pages: 701-718, 2004.

Business rules represent projections of organisations’ constraints and ways of working on their supporting information systems. Therefore, their collection, expression, structuring and organisation should be central activities within information systems analysis. The work presented in this paper concerns the definition of a repository schema for managing business rules, taking into account the objectives (a) to facilitate the rule collection process, (b) to assist the transition from analysis to design and implementation of the information system, and (c) to support business change once the new system has been delivered. These objectives are achieved through the enhancement of the rule repository schema with information on the logistics of the collection process and references to underlying enterprise informational and behavioural knowledge models. The proposed schema and way of working are demonstrated through a number of examples, which are derived from an industrial project concerning electronic procurement in the construction sector.

[ PDF ] [ Bibtex ]

P.J.F. Lucas, L.C. van der Gaag, and A. Abu-Hanna. Bayesian networks in biomedicine and health-care. In: Artificial Intelligence in Medicine, Vol: 30, Pages: 201-214, 2004.

[ Missing PDF ] [ Bibtex ]

P.J.F. Lucas. Bayesian analysis, pattern analysis and data mining in health care. In: Current Opinion in Critical Care Medicine, Vol: 10, Pages: 399-403, 2004.

[ Missing PDF ] [ Bibtex ]

H.A. (Erik) Proper. Editor's Introduction. In: International Journal of Cooperative Information Systems, Nr: 3, Vol: 13, Pages: 211-212, 2004.

[ External URL ] [ Bibtex ]

G.E. Veldhuijzen van Zanten, S.J.B.A. (Stijn) Hoppenbrouwers, and H.A. (Erik) Proper. System Development as a Rational Communicative Process. In: Journal of Systemics, Cybernetics and Informatics, Nr: 4, Vol: 2, International Institute of Informatics and Systemics (IIIS), 2004.

System development is a process in which communication plays an important role. Requirements must be elicited from various stakeholders. But stakeholders also make decisions and must understand the consequences thereof. Different viewpoints must be reconciled, and agreements reached. An important assumption we make is that all actions in the development process are (or should be) based on rational decisions. The quest for rationality is a driving force behind the communication that takes place within the development process, because it raises issues that may otherwise have remained in the subconsciousness of stakeholders. We zoom in on the role of vagueness in communication, and argue that there are good reasons not to try and formalize things too soon in the development process. The purpose of this paper is to position our ongoing research, encourage discussion about the assumptions we make, and inspire novel approaches to system development. We work towards a comprehensive theory of rational system development, in which due attention is paid to development processes, communication, and the representations used therein.

[ PDF ] [ Bibtex ] [ External URL ]

A.J. Hommersom, J.-J.Ch. Meyer, and E.P. de Vink. Update Semantics of Security Protocols. In: Synthese/Knowledge, Rationality and Action, 2004.

We present a model-theoretic approach for reasoning about security protocols, applying recent insights from dynamic epistemic logics. This enables us to describe exactly the subsequent epistemic states of the agents participating in the protocol, using Kripke models and transitions between these based on updates of the agents' beliefs associated with steps in the protocol. As a case study we will consider the SRA Three Pass protocol and discuss the Wide-Mouthed Frog protocol.

[ PDF ] [ Bibtex ]

Book

F.A. Grootjen. A Pragmatic Approach to the Conceptualization of Language. Radboud University, Nijmegen, The Netherlands, EU, 2004, ISBN 90-9018934-3.

[ PDF ] [ Bibtex ]

D.M. Kroenke. Databases, principles, design and implementation (Databases, beginselen, ontwerp en implementatie). 9th edition, Pearson Education Benelux, Amsterdam, The Netherlands, EU, 2004, In Dutch, ISBN 9043008427.

[ Missing PDF ] [ Bibtex ]

D.B.B. Rijsenbrij. Architectuur in de digitale wereld, versie nulpuntdrie (Architecture in the digital world, version zero point three). October, Nijmegen Institute for Information and Computing Sciences, Radboud University Nijmegen, Nijmegen, The Netherlands, EU, 2004, In Dutch, ISBN 90901882853.

[ Missing PDF ] [ Bibtex ]

Framework of Information Systems Concepts: Revised Report. Edited by: A.A. Verrijn-Stuart, and H.A. (Erik) Proper. IFIP WG 8.1 Task Group FRISCO, 2004.

[ Missing PDF ] [ Bibtex ]

M. van Vliet. Software or Softwar? Dimensies en dilemma's in softwareprojecten (Software or Softwar? Dimensions and dilemmas in software projects). March, Nijmegen Institute for Information and Computing Sciences, Radboud University Nijmegen, Nijmegen, The Netherlands, EU, 2004, In Dutch, ISBN 9090177582.

[ PDF ] [ Bibtex ]

J. Warmer, and A. Kleppe. Practical UML (Praktisch UML). 3 edition, Pearson Education Benelux, Amsterdam, The Netherlands, EU, 2004, In Dutch, ISBN 9043008125.

[ Missing PDF ] [ Bibtex ]

Conference

P.J.M. Frederiks, and Th.P. van der Weide. Information Modeling: the process and the required competencies of its participants. In: 9th International Conference on Applications of Natural Language to Information Systems (NLDB 2004), Manchester, United Kingdom, EU, Edited by: F. Meziane, and E. Métais. Lecture Notes in Computer Science, Vol: 3136, Pages: 123-134, Springer, 2004.

In recent literature it is commonly agreed that the first phase of the software development process is still an area of concern. Furthermore, while software technology has changed and improved rapidly, the ways of working and of managing this process have lagged behind.

In this paper the focus is on the process of information modeling, its quality and the required competencies of its participants (domain experts and system analysts). The competencies are discussed and motivated on the assumption that natural language is the main communication vehicle between domain expert and system analyst. As a result, these competencies provide the clue for the effectiveness of the process of information modeling.

[ PDF ] [ Bibtex ]

Logics in Artificial Intelligence, Proceedings of the 9th European Conference, JELIA'04. Edited by: J.J. Alferes, and J.A. Leite. LNCS, Vol: 3229, September, Springer, 2004.

[ Missing PDF ] [ Bibtex ]

A.I. Bleeker, H.A. (Erik) Proper, and S.J.B.A. (Stijn) Hoppenbrouwers. The Role of Concept Management in System Development - A practical and a theoretical perspective. In: Forum proceedings of the 16th Conference on Advanced Information Systems Engineering 2004 (CAiSE 2004), Riga, Latvia, EU, Edited by: J. Grabis, A. Persson, and J. Stirna. Pages: 73-82, June, Faculty of Computer Science and Information Technology, Riga, Latvia, EU, 2004, ISBN 998497670X.

In this article we argue the need for proper concept management during the development of software systems. It is observed how, during system development, a lot of "concept handling" occurs without proper management. We define concept management as the deliberate activity of introducing, evolving and retiring concepts. It is argued that concept management plays an important role during the entire system development life cycle.

The notion of concept management is discussed and elaborated from both a theoretical perspective and a practical perspective. The latter perspective considers concept management in the context of the software development practice of a Dutch IT consultancy firm.

[ PDF ] [ Bibtex ]

J.I. Farkas, and J.J. Sarbo. Mathematica utens. Edited by: H.D. Pfeiffer, H. Delugach, and K.E. Wolff. Pages: 29-42, Shaker-Verlag, Darmstadt, Germany, EU, 2004.

[ PDF ] [ Bibtex ]

M.A.J. van Gerven, and P.J.F. Lucas. Employing Maximum Mutual Information for Bayesian Classification. Pages: 188-199, Springer-Verlag, 2004.

[ PDF ] [ Bibtex ]

M.A.J. van Gerven, and P.J.F. Lucas. Using Background Knowledge to Construct Bayesian Classifiers for Data-Poor Domains. Pages: 269-282, Springer-Verlag, 2004.

[ PDF ] [ Bibtex ]

F.A. Grootjen, and Th.P. van der Weide. Effectiveness of Index Expressions. In: 9th International Conference on Applications of Natural Language to Information Systems (NLDB 2004), Lecture Notes in Computer Science, Vol: 3136, Pages: 171-181, 2004.

The incremental searcher satisfaction model for Information Retrieval has been introduced to capture the incremental information value of documents. In this paper, from various cognitive perspectives, searcher requirements are derived in terms of the increment function. Different approaches for the construction of increment functions are identified, such as the individual and the collective approach. Translating the requirements to similarity functions leads to the so-called base similarity features and the monotonicity similarity features. We show that most concrete similarity functions in IR, such as the Inclusion, Jaccard's, Dice's, and Cosine coefficients, and some other approaches to similarity functions, possess the base similarity features. The Inclusion coefficient also satisfies the monotonicity features.

[ PDF ] [ Bibtex ]

A.J. Hommersom, P.J.F. Lucas, and M. Balser. Meta-level Verification of the Quality of Medical Guidelines using Interactive Theorem Proving. In: Logics in Artificial Intelligence, Proceedings of the 9th European Conference, JELIA'04, LNCS, Vol: 3229, Pages: 654-666, 2004.

Requirements about the quality of medical guidelines can be represented using schemata borrowed from the theory of abductive diagnosis, using temporal logic to model the time-oriented aspects expressed in a guideline. In this paper, we investigate how this approach can be mapped to the facilities offered by a theorem proving system for program verification, KIV. It is shown that the reasoning that is required for checking the quality of a guideline can be mapped to such theorem-proving facilities. The medical quality of an actual guideline concerning diabetes mellitus 2 is investigated in this way, and some problems discovered are discussed.

[ PDF ] [ Bibtex ]

S.J.B.A. (Stijn) Hoppenbrouwers, and H.A. (Erik) Proper. A Communicative Perspective on Second Order Information Systems. In: Proceedings of the 16th International Conference on System Research, Informatics and Cybernetics, Baden-Baden, Germany, Edited by: G.E. Lasker. IIAS, 2004.

We discuss our preliminary efforts to create a generic theoretical model of the socio-technical (information) systems that produce information systems (i.e. of second order information systems). We emphasize the importance of communication, language, and meta-language as factors in information system development processes and systems. Central are conversations related to the specification of information systems. Such conversations typically concern both formal and informal specification, and involve conceptual activities such as creation, adaptation, elicitation, informing, negotiation, validation, and committing. Conversations for conceptualization are an integral part of every specification process.

[ PDF ] [ Bibtex ]

R. Jurgelenaite, and P.J.F. Lucas. Exploiting Causal Independence in Large Bayesian Networks. In: Research and Developments in Intelligent Systems XXI, Edited by: M. Bramer, F. Coenen, and T. Allen. Pages: 157-170, Springer-Verlag, 2004.

[ Missing PDF ] [ Bibtex ]

R. Jurgelenaite, and P.J.F. Lucas. Parameter Estimation in Large Causal Models. In: Proceedings of the 16th ECAI 2004, Edited by: R. López de Mántaras, and L. Saitta. Pages: 1037-1038, IOS Press, Amsterdam, The Netherlands, EU, 2004.

[ Missing PDF ] [ Bibtex ]

P.J.F. Lucas. A system for pacemaker treatment advice. In: Proceedings of the 16th ECAI 2004, Edited by: R. López de Mántaras, and L. Saitta. Pages: 735-739, IOS Press, Amsterdam, The Netherlands, EU, 2004.

Previously it has been shown that the process of programming a cardiac pacemaker can be described in terms of the theory of diagnosis. A set-theoretical framework of diagnosis has been taken as the basis for the construction of a system for pacemaker programming that in its present form is capable of assisting cardiologists. The system was made commercially available in 2003 by Vitatron for its C series pacemakers. In this paper, we discuss the practical requirements imposed by the clinical environment in which pacemaker programming takes place. The theory of diagnosis that has been used is briefly reviewed, after which we describe the capabilities and limitations of the implemented system. The paper is rounded off by some ideas for future development. As far as we know, this is the first system of its kind in the area of pacemakers commercially available.

[ PDF ] [ Bibtex ]

P.J.F. Lucas. Proceedings of the 2nd European Workshop on Probabilistic Graphical Models 2004 (PGM'04). Lorentz Centre, Leiden, 2004.

[ Missing PDF ] [ Bibtex ]

H.A. (Erik) Proper, and S.J.B.A. (Stijn) Hoppenbrouwers. Concept Evolution in Information System Evolution. In: Forum proceedings of the 16th Conference on Advanced Information Systems Engineering 2004 (CAiSE 2004), Riga, Latvia, EU, Edited by: J. Grabis, A. Persson, and J. Stirna. Pages: 63-72, June, Faculty of Computer Science and Information Technology, Riga, Latvia, EU, 2004, ISBN 998497670X.

We look at the evolution of information systems from the perspective of the evolution of domain languages. Many analysis and design approaches for information systems base themselves on techniques involving some sort of natural language analysis. However, the view on language underlying these approaches ignores several issues concerning the nature of language. We discuss these issues, against the background of a more linguistically viable version of the standard notion of 'universe of discourse', and the notion of 'environment of discourse'. We finish by sketching a direction for tackling some of the problems indicated and some initial results, centring round better organised communication about concepts ('linguistic meta-communication' and 'conceptualisation').

[ PDF ] [ Bibtex ]

H.A. (Erik) Proper, A.I. Bleeker, and S.J.B.A. (Stijn) Hoppenbrouwers. Object-Role Modelling as a Domain Modelling Approach. In: Proceedings of the Workshop on Evaluating Modeling Methods for Systems Analysis and Design (EMMSAD'04), held in conjunction with the 16th Conference on Advanced Information Systems Engineering 2004 (CAiSE 2004), Edited by: J. Grundspenkis, and M. Kirikova. Vol: 3, Pages: 317-328, June, Faculty of Computer Science and Information Technology, Riga, Latvia, EU, 2004, ISBN 9984976718.

This paper focuses on the potential role of the Object-Role Modeling (ORM) approach to information modeling for the task of domain modeling. Domain modeling concerns obtaining and modeling the language (concepts, terminologies, ontologies) used by stakeholders to talk about a domain. Achieving conceptual clarity and consensus among stakeholders is an important yet often neglected part of system development, and requirements engineering in particular.

This paper starts out with a brief discussion on the importance of domain modeling in system development. This is followed by an outline of the activities involved in proper domain modeling. We will then discuss why the ORM approach is, in principle, a good candidate for the tasks involved in domain modeling. This is further substantiated by a more detailed evaluation, both from a theoretical and a practical perspective.

[ PDF ] [ Bibtex ]

J.J. Sarbo. Towards a new world-ontology. Pages: 51-58, 2004.

A crucial problem of knowledge representation is that it has to be formal and informal at the same time. Here, formal refers to knowledge as a mathematical concept which is thought, and informal to knowledge as something which is experienced, i.e. a real world phenomenon. This paper is an attempt to reconcile these potentially contradictory conditions. Our aim is to show that a cognitively based formal representation may exist, underlying traditional (formal) ontology, which may uniformly characterize the different domains of human knowledge like logic, language, mathematics, etc. We derive such a representation from the properties of cognition and signification, and elaborate its application in propositional logic and (briefly) in syntax.

[ PDF ] [ Bibtex ]

J.J. Sarbo, and J.I. Farkas. Towards a Theory of Meaning Extraction. , Edited by: H.D. Pfeiffer, H. Delugach, and K.E. Wolff. Pages: 55-68, Shaker-Verlag, Darmstadt, Germany, EU, 2004.

[ PDF ] [ Bibtex ]

B. van Gils, H.A. (Erik) Proper, P. van Bommel, and Th.P. van der Weide. Transformations in Information Supply. In: Proceedings of the Workshop on Web Information Systems Modelling (WISM'04), held in conjunction with the 16th Conference on Advanced Information Systems Engineering 2004 (CAiSE 2004), Riga, Latvia, EU, Edited by: J. Grundspenkis, and M. Kirikova. Vol: 3, Pages: 60-78, June, Faculty of Computer Science and Information Technology, Riga, Latvia, EU, 2004, ISBN 9984976718.

In this article, we present a model for transformation of resources in information supply. These transformations allow us to reason more flexibly about information supply, and in particular its heterogeneous nature. They allow us to change the form (e.g. report, abstract, summary) and format (e.g. PDF, DOC, HTML) of data resources found on the Web. In a retrieval context these transformations may be used to ensure that data resources are presented to the user in a form and format that is apt at that time.

[ PDF ] [ Bibtex ]

S. Visscher, P.J.F. Lucas, C. Schurink, M. Bonten, J. Wolffelaar, and P. van de Werken. Incorporating Evaluation into the Design of a Decision Support System. In: Health Continuum and Data Exchange, Pages: 54-60, IOS Press, Amsterdam, The Netherlands, EU, 2004.

[ Missing PDF ] [ Bibtex ]

Generic engine for user model based adaptation. June, 2004.

User model based adaptation becomes more and more important in interactive systems. In this paper we first review the different possible adaptation models and discuss the concepts of push, pull and hybrid adaptation. We realize that little is explicitly known about hybrid adaptation. We thus propose a way to add hybrid adaptation (which also provides push and pull adaptation) to interactive systems. Consequently such interactive systems combine the advantages of push and pull adaptation in a domain-dependent way.

To be able to perform research on adaptation strategies in general we have implemented an adaptation engine, which provides modular support for adding adaptation to existing or newly developed systems. It allows the authors of the programs using the adaptation to focus on the adaptation they are going to use instead of on the implementation. The adaptation engine works with pluggable adaptation models and as such it is straightforward to change the adaptation model of a system, or to test out different adaptation models.

Finally we give a short introduction into an adaptation model editor and an adaptation model viewer we developed for maintaining and developing adaptation models.

[ PDF ] [ Bibtex ] [ External URL ]

O. Zoeter, A. Ypma, and T. Heskes. Improved unscented Kalman smoothing for stock volatility estimation. Edited by: A. Bassos, J. Principe, J. Larsen, T. Adali, and S. Douglas. 2004.

[ PDF ] [ Bibtex ]

Lecture-Notes

H.A. (Erik) Proper. Architecture-driven Information Systems Engineering. DaVinci Series, Institute for Information and Computing Sciences, Radboud University Nijmegen, 2004.

[ PDF ] [ Bibtex ]

Reports

P. Achten, M. van Eekelen, R. Plasmeijer, and A. van Weelden. Arrows for Generic Graphical Editor Components. Technical report: NIII-R0416, April, Radboud University Nijmegen, 2004.

GUI programming is hard, even for prototyping purposes. In this paper we present the Graphical Editor Component toolkit in which GUIs can be created in an abstract and compositional way. The basic building blocks are (Abstract) Graphical Editor Components ((A)GEC) with which the programmer can create GUIs by specification of the data models only. No low-level GUI programming is required. We show how these building blocks can be glued together conveniently using a combinator library based on the arrow combinators that have been introduced by John Hughes. The proofs of the associated arrow laws can be done with standard reasoning techniques without resorting to a dedicated semantic model.

[ PDF ] [ Bibtex ]

P. Achten, M. van Eekelen, R. Plasmeijer, and A. van Weelden. Automatic Generation of Editors for Higher-Order Data Structures. Technical report: NIII-R0427, June, Radboud University Nijmegen, 2004.

GUI programming is tedious. With generic functional programming techniques, we have constructed a programming toolkit with which one can create GUIs in an abstract and compositional way using type directed Graphical Editor Components (GECs). In this toolkit, the programmer specifies a GUI by means of a data model instead of low-level GUI programming. Due to a fundamental restriction of generic programming techniques, this data model must have a first-order type. In this paper we show that the programming toolkit can be extended in such a way that the data model can contain higher-order types, by combining the earlier developed techniques of generic GECs, abstract editors, and the higher-order shell Esther that is based on dynamic types. In principle this solution extends our GUI programming toolkit with the full expressive power of functional programming languages.

[ PDF ] [ Bibtex ]

P. Achten, M. van Eekelen, and R. Plasmeijer. Compositional Model-Views with Generic Graphical User Interfaces. Technical report: NIII-R0408, February, Radboud University Nijmegen, 2004.

Creating GUI programs is hard even for prototyping purposes. Using the model-view paradigm makes it somewhat simpler since the model-view paradigm dictates that the model contains no GUI programming, as this is done by the views. Still, a lot of GUI programming is needed to implement the views. We present a new method for constructing GUI applications that fits well in the model-view paradigm. Novel in our approach is that the views also contain no actual GUI programming. Instead, views are constructed in a fully compositional way by defining a model of the view. We use a technique developed earlier to generate the GUI part. We show how the method supports flexibility, compositionality and incremental change by introducing abstract components in the view models.

[ PDF ] [ Bibtex ]

P. Achten, M. van Eekelen, R. Plasmeijer, and A. van Weelden. GEC: a toolkit for Generic Rapid Prototyping of Type Safe Interactive Applications. Technical report: NIII-R0432, July, Radboud University Nijmegen, 2004.

GUI programming is notoriously tedious. By using generic functions that create Graphical Editor Components (GECs), it becomes possible to define user interfaces without any knowledge of low level I/O handling. A GEC editor can automatically be derived for values of any (user-defined) monomorphic (first order) type. With an editor the application user can create new values of the type the editor is created for. In this way one obtains an editor for free for almost any type. Such a free editor may not look fancy, but one can change the default look by defining specialized versions of the generic function for certain types, e.g. representing buttons, pull-down menus and the like. Furthermore, with GECs the programmer can create an abstraction level separating the view type and the domain type completely. As a result, the programmer can easily make a library of suited graphical representations and freely choose which representation to use. Consequently, working with GECs a programmer can focus on the data type representing the user interaction instead of on the nasty graphical details. Editors can be easily combined to ensure that a change in one editor has effects on the contents of others. One can combine editors by hand or use an arrow library of editor combinators. It is even possible to create editors for higher order types which enables the creation of user interfaces in which functions can be typed in by the user or read from disk at run-time. In the latter case, functions are actually compiled functions that are dynamically linked into the running application. GECs are suited for rapid prototyping of real world applications, for teaching and for debugging. This paper focuses on the use of the GEC toolkit for functional programmers, only briefly explaining its inner workings and underlying principles.

[ PDF ] [ Bibtex ]

A. Alimarine, and S. Smetsers. Fusing Generic Functions. Technical report: NIII-R0434, August, Radboud University Nijmegen, 2004.

Generic programming is accepted by the functional programming community as a valuable tool for program development. Several functional languages have adopted the generic scheme of type-indexed values. This scheme works by specialization of a generic function to a concrete type. However, the generated code is extremely inefficient compared to its hand-written counterpart. The performance penalty is so big that the practical usefulness of generic programming is compromised. In this paper we present an optimization algorithm that is able to completely eliminate the overhead introduced by the specialization scheme for a large class of generic functions. The presented technique is based on consumer-producer elimination as exploited by fusion, a standard general purpose optimization method. We show that our algorithm is able to optimize many practical examples of generic functions.

[ PDF ] [ Bibtex ]

A. Alimarine, and S. Smetsers. Efficient Generic Functional Programming. Technical report: NIII-R0425, June, Radboud University Nijmegen, 2004.

Generic functions are defined by induction on the structural representation of types. As a consequence, by defining just a single generic operation, one acquires this operation over any particular data type. An instance on a specific type is generated by interpretation of the type's structure. A direct translation leads to extremely inefficient code that involves many conversions between types and their structural representations. In this paper we present an optimization technique based on compile-time symbolic evaluation. We prove that the optimization removes the overhead of the generated code for a considerable class of generic functions. The proof uses typing to identify intermediate data structures that should be eliminated. In essence, the output after optimization is similar to hand-written code.

[ PDF ] [ Bibtex ]

Gerd Behrmann, Ed Brinksma, Martijn Hendriks, and Angelika Mader. Production Scheduling by Reachability Analysis -- A Case Study. Technical report: NIII-R0455, December, Radboud University Nijmegen, 2004.

Schedule synthesis based on reachability analysis of timed automata has received attention in the last few years. The main strength of this approach is that the expressiveness of timed automata allows -- unlike many classical approaches -- the modelling of scheduling problems of very different kinds. Furthermore, the models are robust against changes in the parameter setting and against changes in the problem specification. This paper presents a case study that was provided by Axxom, an industrial partner of the Ametist project. It consists of a scheduling problem for lacquer production, and is treated with the timed automata approach. A number of problems have to be addressed for the modelling task: the information transfer from the industrial partner, the derivation of a timed automaton model for the case study, and the heuristics that have to be added in order to reduce the search space. We try to isolate the generic problems of modelling for model checking, and suggest solutions that are also applicable for other scheduling cases. Model checking experiments indicate that -- for this problem -- the timed automata approach is competitive with Axxom's planning tool.

[ PDF ] [ Bibtex ]

S. Bosman, and Th.P. van der Weide. Assistance for the domain modeling dialog. Technical report: NIII-R0421, Computing Science Institute, University of Nijmegen, 2004.

This paper considers the domain modeling dialog between domain expert and system analyst. In this dialog, the system analyst interprets the domain knowledge provided by the expert, and creates a formal model that captures this knowledge. As the expert may not express knowledge in a very precise way, the system analyst has to find the correct interpretation out of many possible interpretations.
In order to improve the quality of the modeling dialog, we propose a mechanism that enables the system analyst to have a better idea of the intentions of the domain expert, especially where the expert expresses these intentions poorly.

[ PDF ] [ Bibtex ]

Ling Cheung, Nancy Lynch, Roberto Segala, and F.W. Vaandrager. Switched Probabilistic I/O Automata. Technical report: NIII-R0437, September, Radboud University Nijmegen, 2004.

A switched probabilistic I/O automaton is a special kind of probabilistic I/O automaton (PIOA), enriched with an explicit mechanism to exchange control with its environment. Every closed system of switched automata satisfies the key property that, in any reachable state, at most one component automaton is active. We define a trace-based semantics for switched PIOAs and prove it is compositional. We also propose switch extensions of an arbitrary PIOA and use these extensions to define a new trace-based semantics for PIOAs.

[ PDF ] [ Bibtex ]

D.R. Cok, and J.R. Kiniry. ESC/Java2: Uniting ESC/Java and JML. Technical report: NIII-R0413, May, Radboud University Nijmegen, 2004.

The ESC/Java tool was a lauded advance in effective static checking of realistic Java programs, but has become out-of-date with respect to Java and the Java Modeling Language (JML). The ESC/Java2 project, whose progress is described in this paper, builds on the final release of ESC/Java from DEC/SRC in several ways. It parses all of JML, thus can be used with the growing body of JML-annotated Java code; it has additional static checking capabilities; and it has been designed, constructed, and documented in such a way as to improve the tool's usability to both users and researchers. It is intended that ESC/Java2 be used for further research in, and larger-scale case studies of, annotation and verification, and for studies in programmer productivity that may result from its integration with other tools that work with JML and Java.

[ Missing PDF ] [ Bibtex ]

L. Cruz-Filipe, and F. Wiedijk. Equational Reasoning in Algebraic Structures: a Complete Tactic. Technical report: NIII-R0431, July, Radboud University Nijmegen, 2004.

We present ""rational"", a Coq tactic for equational reasoning in abelian groups, commutative rings, and fields. We give an mathematical description of the method that this tactic uses, which abstracts from Coq specifics.We prove that the method that ""rational"" uses is correct, and that it is complete for groups and rings. Completeness means that the method succeeds in proving an equality if and only if that equality is provable from the the group/ring axioms. Finally we characterize in what way our method is incomplete for fields.

[ PDF ] [ Bibtex ]

P.T. de Vrieze, P. van Bommel, and Th.P. van der Weide. A Generic Adaptive Model in Adaptive Hypermedia. Technical report: NIII-R0424, May, Radboud University Nijmegen, 2004.

For adaptive hypermedia there is a strong model in the form of the AHAM model and the AHA! system. This model, based on the Dexter Model, is however limited to application in hypermedia systems. In this paper we propose a new Generic Adaptivity Model. This state-machine based model can be used as the basis for adaptation in all kinds of applications. The Generic Adaptivity Model, when compared to AHAM, has the following features: it is more low-level than the AHAM model, as it does not provide hypermedia-specific concepts. Instead the GAM provides an additional Interface Model that describes the bindings of the adaptivity system with the application. Finally the concepts of push and pull modelling have been incorporated in the Generic Adaptivity Model, allowing system designers to make better decisions on what to store in the user model. This should allow for more flexible adaptation systems.

[ PDF ] [ Bibtex ]

M. Dowse, A. Butterfield, M. van Eekelen, M. de Mol, and R. Plasmeijer. Towards Machine-Verified Proofs for I/O. Technical report: NIII-R0415, April, Radboud University Nijmegen, 2004.

In functional languages, the shape of the external world affects both our understanding of I/O and how we would wish to have I/O expressed. This paper takes the first tentative steps at examining the consequences of using an explicit model of the external world-state when reasoning (using tool-support) about the behaviour of lazy functional programs. We construct a file-system model and develop a monadic language which lets us structure I/O. Two proofs are then performed regarding the observable effect of real-world functional I/O, and it is shown how properties of the file-system lend themselves naturally to the modelling of concurrent behaviour on a single, global state. All proofs in this paper were machine-verified using the Sparkle proof-assistant.

[ PDF ] [ Bibtex ]

Ildiko Flesch, and P.J.F. Lucas. Markov Equivalence in Bayesian Networks. Technical report: NIII-R0436, August, Institute for Computing and Information Science, University of Nijmegen, 2004.

[ PDF ] [ Bibtex ]

B. Gebremichael, T. Krilavicius, and Y.S. Usenko. A Formal Analysis of a Car Periphery Supervision System. Technical report: NIII-R0418, June, Radboud University Nijmegen, 2004.

This paper presents a formal model of the real-time service allocation unit for the Car Periphery Supervision (CPS) system---a case study proposed by Robert Bosch GmbH in the context of the EU IST project AMETIST. The CPS system is a hybrid system, which is modeled in terms of timed automata. It is done by splitting the values of nonlinear continuous variables into a finite set of regions and over-approximating the constraints on continuous variables into clock constraints. Safety properties of the timed model have been verified using UPPAAL. This is a sufficient condition for validating the corresponding safety properties of the initial hybrid system. The difference in time scale between the CPS components has also been taken care of by over-approximating the timed model using the convex-hull over-approximation feature available in UPPAAL.

[ PDF ] [ Bibtex ]

B. Gebremichael, and F.W. Vaandrager. Specifying Urgency in Timed I/O Automata. Technical report: NIII-R0459, December, Radboud University Nijmegen, 2004.

Sifakis et al advocate the use of deadline predicates for the specification of progress properties of Alur-Dill style timed automata. In this article, we extend these ideas to a more general setting, which may serve as a basis for deductive verification techniques. More specifically, we extend the TIOA framework of Lynch et al with urgency predicates. We identify a suitable language to describe the resulting timed I/O automata with urgency and show that for this language time reactivity holds by construction. We also establish that the class of timed I/O automata with urgency is closed under composition. The use of urgency predicates is compared with three alternative approaches to specifying progress properties that have been advocated in the literature: invariants, stopping conditions and deadline predicates. We argue that in practice the use of urgency predicates leads to shorter and more natural specifications than any of the other approaches. Some preliminary results on proving invariant properties of timed (I/O) automata with urgency are presented.

[ PDF ] [ Bibtex ]

F.A. Grootjen, and Th.P. van der Weide. Dualistic Ontologies. Technical report: NIII-R0439, November, Radboud University Nijmegen, 2004.

To effectively use and share knowledge among AI systems, a formal specification of the representation of their shared domain of discourse -- called an ontology -- is indispensable. In this paper we demonstrate the role of dualistic ontologies in human activities such as searching, in-depth exploration and browsing. We start with the introduction of a formal definition of dualistic ontologies, based on a dual view on the universe of discourse. Then we relate this formal definition to three different (well known) kinds of ontologies, based on the vector model, on formal concept analysis and on fuzzy logic respectively. The vector model leads to concepts derived by latent semantic indexing using the singular value decomposition. Both the set model and the fuzzy set model lead to Formal Concept Analysis, in which the fuzzy set model is equipped with a parameter that controls the fine-grainedness of the resulting concepts. We discuss the relation between the resulting systems of concepts. Finally, we demonstrate the use of this theory by introducing the dual search engine. We show how this search engine can be employed to support the human activities addressed above.

[ PDF ] [ Bibtex ]

Foundations Group. Dutch Proof Tools Day 2004. Technical report: NIII-R0429, July, Radboud University Nijmegen, 2004.

These are the proceedings of the Dutch Proof Tools Day 2004, organized by the Foundations group of the NIII, University of Nijmegen. The papers contained herein give an account of the contents of the talks presented at the meeting. The Dutch Proof Tools Day is organized on an annual basis by the Protagonist group of Proof Tool users in the Netherlands (and Belgium). The aim of the day is to bring together researchers that use (or develop) proof tools to discuss the use, applicability and background of proof tools. More specifically, it provides a forum for young researchers to get in touch with other proof tool users and exchange experiences.

[ PDF ] [ Bibtex ]

M. Hendriks, B. van den Nieuwelaar, and F.W. Vaandrager. Model Checker Aided Design of a Controller for a Wafer Scanner. Technical report: NIII-R0430, June, Radboud University Nijmegen, 2004.

For a case-study of a wafer scanner from the semiconductor industry it is shown how model checking techniques can be used to compute (i) a simple yet optimal deadlock avoidance policy, and (ii) an infinite schedule that optimizes throughput. Deadlock avoidance is studied based on a simple finite state model using SMV, and for throughput analysis a more detailed timed automaton model has been constructed and analyzed using the UPPAAL tool. The SMV and UPPAAL models are formally related through the notion of a stuttering bisimulation. The results were obtained within two weeks, which confirms once more that model checking techniques may help to improve the design process of realistic, industrial systems. Methodologically, the case study is interesting since two models (and in fact also two model checkers) were used to obtain results that could not have been obtained using only a single model (tool).

[ PDF ] [ Bibtex ]

M. Hendriks, G. Behrmann, K.G. Larsen, P. Niebert, and F.W. Vaandrager. Adding Symmetry Reduction to Uppaal. Technical report: NIII-R0407, February, Radboud University Nijmegen, 2004.

We describe a prototype extension of the Uppaal real-time model checking tool with symmetry reduction. The symmetric data type scalarset, which is also used in the Murphi model checker, was added to Uppaal's system description language to support the easy static detection of symmetries. Our prototype tool uses state swaps, described and proven sound earlier by Hendriks, to reduce the space and memory consumption of Uppaal. Moreover, under certain assumptions the reduction strategy is canonical, which means that the symmetries are used optimally. For all examples that we experimented with (both academic toy examples and industrial cases), we obtained a drastic reduction of both computation time and memory usage, exponential in the size of the scalar sets used.
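
A minimal sketch of the state-swap idea: before a state is stored, it is mapped to a canonical representative by minimizing over permutations of the scalarset values. The brute-force enumeration below is for illustration only; a real implementation avoids enumerating all permutations.

    from itertools import permutations

    def canonical(state, scalarset_size):
        """State is a tuple of scalarset values; return the
        lexicographically smallest permutation image."""
        ids = range(scalarset_size)
        return min(tuple(perm[v] for v in state)
                   for perm in (dict(zip(ids, p)) for p in permutations(ids)))

    # Two states that differ only by swapping scalarset values 0 and 1
    # collapse to the same representative, shrinking the state space.
    print(canonical((1, 0, 1), 2), canonical((0, 1, 0), 2))  # equal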

[ PDF ] [ Bibtex ]

S.J.B.A. (Stijn) Hoppenbrouwers, A.I. Bleeker, and H.A. (Erik) Proper. Modeling Linguistically Complex Business Domains. Technical report: NIII-R0405, January, Radboud University Nijmegen, 2004.

The paper focuses on business domain modeling as part of requirements engineering in software development projects. Domain modeling concerns obtaining and modeling the language (concepts, terminologies, ontologies) used by stakeholders to talk about a domain. Achieving conceptual clarity and consensus among stakeholders is an important yet often neglected part of requirements engineering. Domain modeling can play a key role in supporting it. This does, however, require a nuanced approach to language aspects of domain modeling, as well as ambition management concerning its goals and the procedure followed. We provide an analysis of the linguistic complexities involved, as well as of various levels of ambition concerning the domain modeling process. On top of the "classic" approach to domain modeling, we consider the modeling of multiple terminologies within a domain, and domain evolution; we elaborate on the first two aspects.

[ Missing PDF ] [ Bibtex ]

E.M.G.M. Hubbers, B.P.F. Jacobs, and Wolter Pieters. RIES - Internet Voting in Action. Technical report: NIII-R0449, December, Radboud University Nijmegen, 2004.

RIES stands for Rijnland Internet Election System. It is an online voting system that was developed by one of the Dutch local water management authorities. The system was used twice in the fall of 2004, for a total of approximately two million potential voters. In this paper we describe how this system works. Furthermore, we describe not only how the outcome of the elections can be verified, but also how we actually verified it. To conclude the paper, we describe some possible points for improvement.

[ PDF ] [ Bibtex ]

J. Hughes, and M. Niqui. Admissible Digit Sets and a Modified Stern-Brocot Representation. Technical report: NIII-R0401, January, Radboud University Nijmegen, 2004.

We examine a special case of admissible representations of the closed interval, namely those which arise via sequences of a finite number of Möbius transformations. We regard certain sets of Möbius transformations as a generalized notion of digits and introduce sufficient conditions that such a "digit set" yields an admissible representation of [0,+∞]. Furthermore we establish the productivity and correctness of the homographic algorithm for such "admissible" digit sets. In the second part of the paper we discuss representation of positive real numbers based on the Stern-Brocot tree. We show how we can modify the usual Stern-Brocot representation to yield a ternary admissible digit set.
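
As background, the classic Stern-Brocot representation mentioned above can be sketched as the L/R path to a rational in the Stern-Brocot tree, found by repeated mediants; the paper's ternary admissible digit set refines this, which the sketch does not attempt.

    from fractions import Fraction

    def stern_brocot_word(x):
        """L/R path to the exact rational x > 0 in the Stern-Brocot tree."""
        lo, hi = (0, 1), (1, 0)          # fractions as (num, den); 1/0 plays the role of infinity
        word = []
        while True:
            m = (lo[0] + hi[0], lo[1] + hi[1])   # mediant of lo and hi
            c = x - Fraction(m[0], m[1])
            if c == 0:
                return "".join(word)
            if c > 0:
                word.append("R"); lo = m        # target lies right of the mediant
            else:
                word.append("L"); hi = m        # target lies left of the mediant

    print(stern_brocot_word(Fraction(3, 7)))  # -> "LLRR"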

[ Missing PDF ] [ Bibtex ]

J.R. Kiniry. Semantic Properties for Lightweight Specification in Knowledgeable Development Environments. Technical report: NIII-R0420, May, Radboud University Nijmegen, 2004.

Semantic properties are domain-specific specification constructs used to augment an existing language with richer semantics. These properties are taken advantage of in system analysis, design, implementation, testing, and maintenance through the use of documentation and source-code transformation tools. Semantic properties are themselves specified at two levels: loosely, with precise natural language, and formally, within the problem domain. The refinement relationships between these specification levels, as well as between a semantic property's use and its realization in program code via tools, are specified with a new formal method for reuse called kind theory.

[ Missing PDF ] [ Bibtex ]

J.R. Kiniry. Kind Theory and Software Reuse. Technical report: NIII-R0419, May, Radboud University Nijmegen, 2004.

Kind theory is a logic for describing and reasoning about structured knowledge in communities. It provides a formal framework for describing, finding, customizing, composing, and reasoning about structured domains, such as those of software and mathematics. One of the main domains studied using kind theory is that of software, particularly with regard to software as reusable knowledge about computation. This paper summarizes kind theory and several example applications of the theory to topics relating to software reuse.

[ Missing PDF ] [ Bibtex ]

P. Koopman, and M.J. Plasmeijer. Testing Reactive Systems with GAST. Technical report: NIII-R0403, January, Radboud University Nijmegen, 2004.

Gast is a fully automatic test system. Given a logical property, stated as a function, it is able to generate appropriate test values, to execute tests with these values, and to evaluate the results of these tests. Many reactive systems, like automata and protocols, however, are specified by a model rather than in logic. There exist tools that are able to test software described by such a model-based specification, but these tools have limited capabilities to generate test data involving data types. Moreover, in these tools it is hard or even impossible to state properties of these values in logic. In this paper we introduce some extensions of Gast to combine the best of logic-based and model-based testing. The integration of model-based testing and testing based on logical properties in a single automatic system is the contribution of this paper. The system consists of just a small library instead of a huge stand-alone system.
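
The following sketch mimics, in Python, the logic-based half of this interface: a property stated as a function is checked against generated test values. Gast itself is a Clean library and generates test data systematically rather than randomly, so this is only an interface-level analogy.

    import random

    def check(prop, gen, runs=1000):
        """Generate test values, run the property, report the outcome."""
        for _ in range(runs):
            x = gen()
            if not prop(x):
                return f"Counterexample: {x!r}"
        return f"Passed {runs} tests"

    # Property: reversing a list twice yields the original list.
    print(check(lambda xs: list(reversed(list(reversed(xs)))) == xs,
                lambda: [random.randint(0, 9) for _ in range(random.randint(0, 8))]))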

[ Missing PDF ] [ Bibtex ]

Theo Schouten, and Egon L. van den Broek. Fast Exact Euclidean Distance (FEED) Transformation. Technical report: NIII-R0456, December, Radboud University Nijmegen, 2004.

Fast Exact Euclidean Distance (FEED) transformation is introduced, starting from the inverse of the distance transformation. The prohibitive computational cost of a naive implementation of the traditional Euclidean distance transformation is tackled by three operations: restriction of both the number of object pixels and the number of background pixels taken into consideration, and pre-computation of the Euclidean distance. Compared to the Shih and Liu 4-scan method, the FEED algorithm is often faster and less memory consuming.
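
For contrast, here is the naive Euclidean distance transformation that FEED takes as its starting point: each background pixel receives the minimum ED over all object pixels. FEED's three cost-reducing operations are not reproduced here.

    import numpy as np

    def naive_edt(image):
        """image: 2D bool array, True = object pixel.
        Returns the exact Euclidean distance map (object pixels get 0)."""
        obj = np.argwhere(image)
        out = np.zeros(image.shape)
        for p in np.argwhere(~image):
            out[tuple(p)] = np.sqrt(((obj - p) ** 2).sum(axis=1)).min()
        return out

    img = np.zeros((5, 5), dtype=bool)
    img[2, 2] = True                 # a single object pixel in the centre
    print(naive_edt(img))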

[ Missing PDF ] [ Bibtex ]

Theo Schouten, Harco Kuppens, and Egon L. van den Broek. Exact Euclidean Distance (tFEED) Maps. Technical report: NIII-R0457, December, Radboud University Nijmegen, 2004.

In image and video analysis, distance maps are frequently used. They provide the (Euclidean) distance (ED) of background pixels to the nearest object pixel. In a naive implementation, each object pixel feeds its (exact) ED to each background pixel; the minimum of these values then denotes the ED to the closest object. Recently, the Fast Exact Euclidean Distance (FEED) transformation was launched, which was up to 2x faster than the fastest algorithms available. In this paper, first additional improvements to the original FEED algorithm are discussed. Next, a timed version of FEED (tFEED) is presented, which generates distance maps for video sequences by merging partial maps. For each object in a video, a partial map can be calculated for different frames, where the partial map for fixed objects is only calculated once. In a newly developed, dynamic test environment for robot navigation purposes, tFEED proved to be up to 7x faster than using FEED on each frame separately. It is up to 4x faster than the fastest ED algorithm available for video sequences and even 40% faster than generating city-block or chamfer distance maps for frames. Hence, tFEED is the first real-time algorithm for generating exact ED maps of video sequences.
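
The merging step can be pictured as a pointwise minimum over per-object partial maps, as sketched below; the per-object maps in the example are computed analytically for two invented point objects.

    import numpy as np

    def merge_partial_maps(maps):
        """maps: iterable of 2D distance maps, one per object.
        The combined map is their pointwise minimum, so the map of a
        fixed object can be computed once and reused across frames."""
        return np.minimum.reduce(list(maps))

    fixed_obstacle = np.fromfunction(lambda y, x: np.hypot(y - 0, x - 0), (4, 4))
    moving_robot   = np.fromfunction(lambda y, x: np.hypot(y - 3, x - 3), (4, 4))
    print(merge_partial_maps([fixed_obstacle, moving_robot]))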

[ Missing PDF ] [ Bibtex ]

F.W. Vaandrager, and A.L. de Groot. Analysis of a Biphase Mark Protocol with Uppaal and PVS. Technical report: NIII-R0445, November, Radboud University Nijmegen, 2004.

The biphase mark protocol is a convention for representing both a string of bits and clock edges in a square wave. The protocol is frequently used for communication at the physical level of the ISO/OSI hierarchy, and is implemented on microcontrollers such as the Intel 82530 Serial Communications Controller. An important property of the protocol is that bit strings of arbitrary length can be transmitted reliably, despite differences in the clock rates of sender and receiver, variations of the clock rates (jitter), and distortion of the signal after generation of an edge. In this article, we show how the protocol can be modelled naturally in terms of timed automata. We use the model checker Uppaal to derive the maximal tolerances on the clock rates, for different instances of the protocol, and to support the general parametric verification that we formalized using the proof assistant PVS. Based on the derived parameter constraints we propose instances of the biphase mark protocol that are correct (at least in our model) but have a faster bit rate than the instances that are commonly implemented in hardware.
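
As an illustration of the convention itself, the sketch below encodes a bit string as a biphase mark square wave: every bit cell starts with an edge, and a 1-bit adds an extra mid-cell edge. The cell width is an arbitrary choice for the example; the paper's contribution concerns the clock-rate tolerances under which such a wave is decoded reliably.

    def biphase_mark(bits, cell=4):
        """Return the square wave (list of 0/1 samples) for the bits."""
        level, wave = 0, []
        for b in bits:
            level ^= 1                       # edge at the start of each cell
            half = cell // 2
            wave += [level] * half
            if b:                            # 1-bit: extra edge mid-cell
                level ^= 1
            wave += [level] * (cell - half)
        return wave

    print(biphase_mark([1, 0, 1]))  # [1,1,0,0, 1,1,1,1, 0,0,1,1]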

[ PDF ] [ Bibtex ]

Eva M. van Rikxoort, Egon L. van den Broek, and Theo Schouten. Mimicking human texture classification. Technical report: NIII-R0458, December, Radboud University Nijmegen, 2004.

In an attempt to mimic human (colorful) texture classification by a clustering algorithm, three lines of research were pursued, using a test set of 180 texture images (both their color and gray-scale equivalents) drawn from the OuTex and VisTex databases. First, a k-means algorithm was applied with three feature vectors, based on color/gray values, four texture features, and their combination. Second, 18 participants clustered the images using a newly developed card-sorting program. The mutual agreement between the participants was 57% and 56%, and between the algorithm and the participants it was 47% and 45%, for color and gray-scale texture images respectively. Third, in a benchmark, 30 participants judged the algorithm's clusters of gray-scale textures as more homogeneous than those of colored textures. However, a high interpersonal variability was present for both the color and the gray-scale clusters. So, despite the promising results, it is questionable whether average human texture classification can be mimicked (if it exists at all).
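
The clustering side of the study can be sketched as plain k-means over per-image feature vectors; the synthetic four-dimensional features below are stand-ins for the color and texture features used in the paper.

    import numpy as np

    def kmeans(X, k, iters=50, seed=0):
        """Plain k-means: assign points to nearest centre, recompute centres."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(iters):
            labels = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2).argmin(axis=1)
            centers = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                                else centers[j] for j in range(k)])
        return labels

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(m, 0.2, (30, 4)) for m in (0.0, 1.0, 2.0)])
    print(kmeans(X, 3))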

[ Missing PDF ] [ Bibtex ]

M.C.J.D. van Eekelen, and M.J. de Mol. Mixed Lazy/Strict Natural Semantics. Technical report: NIII-R0402, January, Radboud University Nijmegen, 2004.

Explicitly enforcing strictness is often used by functional programmers as an important tool for making applications fit time and space efficiency requirements. Few functional programmers, however, are familiar with the consequences of explicitly enforcing strictness for formal reasoning about their programs. Some "folklore" knowledge is well known, but up to now no formal model for reasoning with enforced strictness has been available. This has greatly hampered formal reasoning on mixed lazy/strict programs. This paper presents a model for formal reasoning with enforced strictness based on John Launchbury's natural lazy graph semantics. Lazy graph semantics are widely considered to be essential for lazy functional programming languages. In this paper Launchbury's semantics are extended with an explicit strict let construct. We show that such a construct extends the semantics in a natural way. We discuss two variants of this construct and prove correctness and adequacy of our mixed semantics for both variants.
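
The operational intuition behind a strict let can be sketched with thunks: a lazy let binds an unevaluated computation, while a strict let forces the binding before the body runs. This Python sketch only illustrates that intuition; it is not Launchbury-style graph semantics, although the memoized thunk does mimic sharing.

    class Thunk:
        def __init__(self, compute):
            self.compute, self.done, self.value = compute, False, None
        def force(self):
            if not self.done:                 # evaluate at most once (sharing)
                self.value, self.done = self.compute(), True
            return self.value

    def lazy_let(expr, body):      # the binding stays an unevaluated thunk
        return body(Thunk(expr))

    def strict_let(expr, body):    # the binding is forced before the body
        t = Thunk(expr); t.force()
        return body(t)

    # The lazy let never runs the failing binding; the strict let would
    # raise ZeroDivisionError before the body is ever entered.
    print(lazy_let(lambda: 1 // 0, lambda x: "binding unused, no division by zero"))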

[ Missing PDF ] [ Bibtex ]

A. van Weelden, L. Frantzen, M. Oostdijk, P. Koopman, and J. Tretmans. On-the-Fly Formal Testing of a Smart Card Applet. Technical report: NIII-R0428, June, Radboud University Nijmegen, 2004.

Smart cards are used in critical application areas, which means that their software should function correctly. Formal methods are indispensable in obtaining high-quality systems. This paper presents a case study on the use of formal methods in specification-based, black-box testing of a smart card applet. The system under test is a simple electronic purse application running on a Java Card platform. The specification of the applet is given as a Statechart model and transformed into a functional form to serve as the input for the test generation, execution, and analysis tool Gast. Several test runs were conducted, completely automatically, altogether consisting of several millions of test events. The tests were applied to an (assumed to be) correct applet implementation, as well as to some error-seeded implementations. They showed that automated, formal, specification-based testing of smart card applets is feasible, and that errors can be detected.
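
Schematically, this style of on-the-fly testing drives the implementation and the (functional form of the) model with the same inputs and compares outputs step by step; the toy purse model below is invented and far simpler than the applet tested in the paper.

    def run_conformance(spec_step, impl_step, inputs, state0):
        """Step specification and implementation in lockstep; flag the
        first input on which their outputs diverge."""
        s_spec = s_impl = state0
        for i in inputs:
            s_spec, out_spec = spec_step(s_spec, i)
            s_impl, out_impl = impl_step(s_impl, i)
            if out_spec != out_impl:
                return f"FAIL on input {i!r}: expected {out_spec!r}, got {out_impl!r}"
        return "PASS"

    # Toy purse: state = balance; inputs = ("credit", n) or ("debit", n).
    def purse(state, inp):
        op, n = inp
        if op == "credit":
            return state + n, "ok"
        return (state - n, "ok") if n <= state else (state, "insufficient")

    print(run_conformance(purse, purse, [("credit", 5), ("debit", 3), ("debit", 9)], 0))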

[ Missing PDF ] [ Bibtex ]

Gert Veldhuijzen, and Th.P. van der Weide. Query by Dialog: Maximizing Information gain in Interactive Information Disclosure. Technical report: NIII-R0447, November, Radboud University Nijmegen, 2004.

Formulating queries for search engines is a difficult task, which requires insight into the document collection that the search engine has access to. We describe an interactive approach to information disclosure, in which the search engine acts as a guide that helps the searcher find the information that she needs. The guide actively seeks to acquire information from the searcher about her information need. The dialog between searcher and guide is treated as an information exchange: the searcher acquires information about the collection, and the guide acquires information about the information need of the searcher. The guide and searcher collaborate to find a focus (a subset of the document collection) that is relevant to the information need of the searcher. At each step in the dialog, the guide presents suggestions for query expansion, based on the information gain that can be obtained when these suggestions are followed or rejected.
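
The selection criterion can be sketched as follows: a suggestion that splits the current focus evenly yields the most informative accept/reject answer. The entropy computation below is standard; modelling a suggestion as a single term with a known document share is our simplification.

    import math

    def info_gain(focus_size, n_with_term):
        """Bits learned from an accept/reject answer on one suggestion,
        assuming the answer probability equals the term's share of the focus."""
        p = n_with_term / focus_size
        if p in (0.0, 1.0):
            return 0.0                      # answer is certain: nothing learned
        return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

    # A term present in half the focus is the most informative question.
    for n in (1, 5, 9):
        print(n, round(info_gain(10, n), 3))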

[ Missing PDF ] [ Bibtex ]

Paul de Vrieze, P. van Bommel, and Th.P. van der Weide. A Method for Incorporating User Modelling. Technical report: NIII-R0448, 2004.

In this paper a method is presented for adding user modelling to existing software systems. The method consists of seven steps that lead from initial analysis to the definition and evaluation of the elements needed for the adaptive behaviour. Furthermore, the concept of an adaptation element is introduced. Such an adaptation element can be used to determine the impact of personalisations.

[ PDF ] [ Bibtex ]

E.M.G.M. Hubbers, and E. Poll. Transactions and non-atomic API methods in Java Card: specification ambiguity and strange implementation behaviours. Technical report: NIII-R0438, October, 2004.

This paper discusses an ambiguity in Sun's specification of the Java Card platform, which we noticed in the course of developing the precise formal description of the Java Card transaction mechanism presented in an earlier paper. The ambiguity concerns the Java Card transaction mechanism, in particular the interaction of the transaction mechanism with two Java Card API methods, arrayCopyNonAtomic and arrayFillNonAtomic in the class javacard.framework.Util. The paper also describes the experiments we performed with smartcards of different manufacturers (IBM and Schlumberger) to find out the behaviour actually implemented on these cards. Interestingly, these experiments revealed some unexpected (and unexplainable) behaviour of these two methods on some cards.

[ PDF ] [ Bibtex ]

N.A. Lynch, R. Segala, and F.W. Vaandrager. Compositionality for Probabilistic Automata. Technical report: NIII-R0444, Radboud University Nijmegen, 2004.

We establish that on the domain of probabilistic automata, the trace distribution preorder coincides with the simulation preorder.

[ PDF ] [ Bibtex ]

Professional

H.A. (Erik) Proper. Systeemontwikkeling is Evolutie. In: Informatie, Nr: 6, Vol: 46, Pages: 44-48, 2004, In Dutch.

Traditionally, information systems are developed by means of large-scale projects. It is questionable, however, to what extent a traditional way of thinking in terms of projects still applies in our modern digital society. This article aims to analyse the underlying questions in more detail, and tries to prompt the reader to reflect on the role of projects in the traditional sense. The goal of this article is therefore not so much to propose a concrete new way of working, but rather to start a discussion about the essential challenges in information system development.

[ PDF ] [ Bibtex ]

H.A. (Erik) Proper. Informatiekundigen in het Digitale Tijdperk. In: .ego - Magazine voor Informatiekundigen, Nr: 2, Vol: 3, Pages: 4, SBIT, University of Tilburg, Tilburg, The Netherlands, EU, 2004, In Dutch.

The past century was to a large extent devoted to the further development and expansion of the industrial society. By now, however, we have taken the first steps into a new era: the digital era. I deliberately do not speak of "the information age" here. In the last decades of the previous century it was regularly claimed that we would leave the industrial era behind us and move on to "the information age". In my opinion, such claims were about one hundred to one hundred and fifty years behind the facts. To my mind, the information age is inextricably linked to the rise of the industrial era. Along with industrialisation, ever more flows of information and documents emerged to keep that same society on track. Of course, information flows existed before industrialisation as well, but these flows only really took shape once industrialisation got under way. Gradually, more and more organisations have emerged, such as financial institutions and regulatory agencies, that can in essence be seen as large information-processing factories.

[ PDF ] [ Bibtex ]

V.E. van Reijswoud, and H.A. (Erik) Proper. Struggling in the Bush: ICT-onderwijs in Oeganda. In: TINFON, Nr: 2, Vol: 13, Pages: 59-61, June, 2004, In Dutch.

Uganda lies in the heart of the African continent and is trying, from a position of disadvantage, to connect with the rest of the world. In recent years a number of major economic initiatives have been launched to realise this connection, and ICT plays an important role in them. The dominant role of ICT in Europe and North America compels Uganda, too, to invest in a reliable ICT infrastructure. This intention, however, is far from straightforward, since hardly any ICT knowledge or experience with ICT projects is available in the country. Universities try to fill this gap, but how does that work when there are few qualified lecturers? In this article the authors attempt to sketch the situation of ICT education in Uganda, and also address some of the challenges that lie ahead in both ICT knowledge development and ICT education.

[ PDF ] [ Bibtex ]

D.B.B. Rijsenbrij. Outsourcing zonder enterprise architectuur lijkt op autorijden zonder veiligheidsgordel. Technical report: NIII-R0404, Nijmegen Institute for Information and Computing Sciences, University of Nijmegen, 2004.

Owing to the economic headwinds in which we have ended up, many financially alert business managers see outsourcing as the solution of choice for achieving significant cost reductions. Outsourcing (parts of) the IT function (ITO), or even entire business processes (BPO), is moreover seen as the panacea for moving complex problem situations and laborious administrative processes out of the door.

On the other hand, the IT landscape of most companies and institutions presents a chaotic picture, despite the cleansing effect that the Y2K problem (the millennium transition) and the transition to the euro could have had. A usable enterprise architecture, one that can serve as a steering instrument for crucial decisions about complex transformations in the enterprise, including outsourcing, is usually lacking.

The authors argue that responsible outsourcing should be anchored in a clear enterprise architecture. They therefore state that outsourcing without an enterprise architecture is like driving a car without a seat belt. It may seem comfortable for a while, but when the enterprise is forcefully confronted with totally different market conditions, significantly changing consumer patterns or new innovative technological possibilities, the negative consequences can no longer be contained.

[ PDF ] [ Bibtex ]