Author: John Debenham
Affiliation: University of Technology, Sydney, Australia
Keyword(s): Intelligent Agents, Business Process, Agents for Internet Computing.
Related Ontology Subjects/Areas/Topics: Agents; Artificial Intelligence; Artificial Intelligence and Decision Support Systems; Business Process Management; e-Business; Enterprise Engineering; Enterprise Information Systems; Intelligent Agents; Internet Technology; Knowledge Management and Information Sharing; Knowledge-Based Systems; Symbolic Systems; Web Information Systems and Technologies
Abstract:
Emergent processes are business processes whose execution is determined by the prior knowledge of the agents involved and by the knowledge that emerges during a process instance. The amount of process knowledge relevant to a knowledge-driven process can be enormous and may include common-sense knowledge. If a process's knowledge cannot be represented feasibly then that process cannot be managed, although its execution may be partially supported. In an e-market domain, the majority of transactions, including trading orders and requests for advice and information, are knowledge-driven processes for which the knowledge base is the Internet, and so representing the knowledge is not at issue. Multiagent systems are an established platform for managing complex business processes. What is needed for emergent process management is an intelligent agent that is driven not by a process goal but by an in-flow of knowledge, where each chunk of knowledge may be uncertain. Such an agent must assess the extent to which it chooses to believe that incoming information is correct, and so it requires an inference mechanism that can cope with information of differing integrity. An agent is described that achieves this by using ideas from information theory, and by using maximum entropy logic to derive integrity estimates for knowledge about which it is uncertain. Emergent processes are managed by these agents, which extract the process knowledge from this knowledge base, the Internet, using a suite of data-mining bots. The agents make no assumptions about the internals of the other agents in the system, including their motivations, their logic, and whether they are conscious of a utility function; they focus only on the information in the signals that they receive.
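The maximum-entropy step described in the abstract can be sketched as follows: uncertain chunks of knowledge, each carrying an integrity estimate, are treated as expectation constraints over a set of possible worlds, and the agent adopts the least-presumptuous distribution consistent with those constraints, from which it reads off derived integrity estimates. The propositions, probabilities, and gradient-ascent solver below are illustrative assumptions for a minimal sketch, not the paper's implementation.

```python
import math

# Possible worlds over two propositions A and B, encoded as (A, B) truth values.
# (A and B are hypothetical propositions chosen for illustration.)
WORLDS = [(a, b) for a in (0, 1) for b in (0, 1)]

# Uncertain knowledge as expectation constraints over the worlds, e.g. two
# chunks with assumed integrity estimates P(A) = 0.7 and P(A and B) = 0.5.
FEATURES = [lambda w: w[0], lambda w: w[0] * w[1]]
TARGETS = [0.7, 0.5]

def distribution(lam):
    """Gibbs form of the max-entropy solution: p(w) proportional to
    exp(sum_i lam_i * f_i(w))."""
    weights = [math.exp(sum(l * f(w) for l, f in zip(lam, FEATURES)))
               for w in WORLDS]
    z = sum(weights)
    return [wt / z for wt in weights]

def max_entropy(steps=20000, lr=0.5):
    """Fit the Lagrange multipliers by gradient ascent on the concave dual."""
    lam = [0.0] * len(FEATURES)
    for _ in range(steps):
        probs = distribution(lam)
        for i, f in enumerate(FEATURES):
            exp_f = sum(p * f(w) for p, w in zip(probs, WORLDS))
            lam[i] += lr * (TARGETS[i] - exp_f)  # gradient: target - expectation
    return dict(zip(WORLDS, distribution(lam)))

p = max_entropy()
# Derived integrity estimate for B, a proposition the agent holds no
# direct constraint on: mass of all worlds in which B is true.
p_b = sum(pr for w, pr in p.items() if w[1] == 1)
print(round(p_b, 3))
```

Because no constraint distinguishes the two worlds where A is false, the maximum-entropy distribution splits their probability mass equally, which is exactly the "least presumptuous" behaviour that motivates using entropy maximisation for knowledge of differing integrity.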