CA Online, humans act both as universal everymen and as community members with their own cultural assumptions. When people transact business online, the legal and social relationships thus engendered place on each participant "a range of rights and responsibilities that underpin the regulation of the net as a community." (p.104) So while the interrelations may seem more complex in cyberspace, establishing the relationships between the key parties remains crucial to ascertaining their legal obligations, whether they are online or offline. (p.120)
Conclusions
RQ In order to ensure that evidential requirements are extended to net transactions, we must address the following questions: Are we revisiting the problems of electronic information systems without recordkeeping functionality in the cyberspace environment? Can intranet systems linked to the Net retrieve transactions with all their context intact?
CA A major future challenge for recordkeeping professionals is to maximize knowledge via the deft use of metadata as a management tool.
Phrases
<P1> Recordkeeping in the 21st century will have to confront the fact that the very definition of what constitutes a record is dynamically changing. (p.6) <P2> With the advent of the Internet and the streaming of information from the uncharted, open environment which the Internet represents, it appears that public institutions will act to consider and incorporate as part of their best practices the use of new technologies, such as digital signatures and public key encryption, to ensure that authentic and trustworthy information is captured as part of their dealings with the public at large. (p.5)
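The phrase above points to digital signatures and public key encryption as emerging best practice for capturing authentic, trustworthy records. As an illustration only, the following minimal Python sketch (using the third-party cryptography package, with hypothetical record content) shows how an agency system might sign a record at capture time and verify it later; the source does not prescribe any particular implementation.

```python
# Hedged sketch: signing a captured record so its authenticity can be verified later.
# Assumes the third-party "cryptography" package; the record content is hypothetical.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Key pair held by the public institution capturing the record.
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

# The record as captured at the moment of the transaction.
record_bytes = b"2003-06-12 | permit application #4711 | received via web form"

# Sign at capture time; store the signature alongside the record.
signature = private_key.sign(record_bytes)

# Years later, a reviewer checks that the record has not been altered.
try:
    public_key.verify(signature, record_bytes)
    print("record verified as authentic")
except InvalidSignature:
    print("record fails verification; it may have been altered")
```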
Conclusions
RQ How will we deal with the records of the future -- electronic documents with a variety of embedded, interactive attachments?
Type
Journal
Title
Building record-keeping systems: Archivists are not alone on the wild frontier
CA The digital environment offers archivists a host of new tools that can be adapted and used for recordkeeping. However, archivists must choose their tools judiciously, weighing the long-term implications of their use as well as ongoing research and development. Ultimately, they must pick tools and strategies that dovetail with their institutions' specific needs while working to produce reliable and authentic records.
Phrases
<P1> Evidence from this review of emerging methods for secure and authentic electronic communications shows that the division of responsibility, accountability, and jurisdiction over recordkeeping is becoming more complex than a clear line between the records creator and the records preserver. (p.66) <P2> Storage of records in encrypted form is another area of concern because encryption adds additional levels of systems dependency on access to keys, proprietary encryption algorithms, hardware, and software. (p.62) <P3> It is important for archivists and records managers to understand parallel developments, because some new strategies and methods may support recordkeeping, while others may impede the achievement of archival objectives. (p.45) <P4> The concept of warrant and subsequent research on it by Wendy Duff is a significant contribution, because it situates the mandates for creating and maintaining records in a legal, administrative, and professional context, and it presents a methodology for locating, compiling, and presenting the rules governing proper and adequate documentation in modern organizations. (p. 48)
Conclusions
RQ Are electronic recordkeeping systems truly inherently inferior to paper-based systems in their capacity to maintain authentic records over time? How tightly can recordkeeping be integrated into normal business processes, and where does one draw the line between how a business does its work and how it does its recordkeeping?
Type
Electronic Journal
Title
The Warwick Framework: A container architecture for diverse sets of metadata
This paper is an abbreviated version of The Warwick Framework: A Container Architecture for Aggregating Sets of Metadata. It describes a container architecture for aggregating logically, and perhaps physically, distinct packages of metadata. This "Warwick Framework" is the result of the April 1996 Metadata II Workshop in Warwick, U.K.
ISBN
1082-9873
Critical Arguements
CA Describes the Warwick Framework, a proposal for linking together the various metadata schemes that may be attached to a given information object by using a system of "packages" and "containers." "[Warwick Workshop] attendees concluded that ... the route to progress on the metadata issue lay in the formulation of a higher-level context for the Dublin Core. This context should define how the Core can be combined with other sets of metadata in a manner that addresses the individual integrity, distinct audiences, and separate realms of responsibility of these distinct metadata sets. The result of the Warwick Workshop is a container architecture, known as the Warwick Framework. The framework is a mechanism for aggregating logically, and perhaps physically, distinct packages of metadata. This is a modularization of the metadata issue with a number of notable characteristics. It allows the designers of individual metadata sets to focus on their specific requirements, without concerns for generalization to ultimately unbounded scope. It allows the syntax of metadata sets to vary in conformance with semantic requirements, community practices, and functional (processing) requirements for the kind of metadata in question. It separates management of and responsibility for specific metadata sets among their respective "communities of expertise." It promotes interoperability by allowing tools and agents to selectively access and manipulate individual packages and ignore others. It permits access to the different metadata sets that are related to the same object to be separately controlled. It flexibly accommodates future metadata sets by not requiring changes to existing sets or the programs that make use of them."
Phrases
<P1> The range of metadata needed to describe and manage objects is likely to continue to expand as we become more sophisticated in the ways in which we characterize and retrieve objects and also more demanding in our requirements to control the use of networked information objects. The architecture must be sufficiently flexible to incorporate new semantics without requiring a rewrite of existing metadata sets. <warrant> <P2> Each logically distinct metadata set may represent the interests of and domain of expertise of a specific community. <P3> Just as there are disparate sources of metadata, different metadata sets are used by and may be restricted to distinct communities of users and agents. <P4> Strictly partitioning the information universe into data and metadata is misleading. <P5> If we allow for the fact that metadata for an object consists of logically distinct and separately administered components, then we should also provide for the distribution of these components among several servers or repositories. The references to distributed components should be via a reliable persistent name scheme, such as that proposed for Universal Resource Names (URNs) and Handles. <P6> [W]e emphasize that the existence of a reliable URN implementation is necessary to avoid the problems of dangling references that plague the Web. <warrant> <P7> Anyone can, in fact, create descriptive data for a networked resource, without permission or knowledge of the owner or manager of that resource. This metadata is fundamentally different from that metadata that the owner of a resource chooses to link or embed with the resource. We, therefore, informally distinguish between two categories of metadata containers, which both have the same implementation [internally referenced and externally referenced metadata containers].
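To make the container/package model described above concrete, here is a small, hypothetical Python sketch: a container holds logically distinct metadata packages, each managed by its own community, and a package may either carry its metadata internally or reference it externally through a persistent name such as a URN. The class and field names are illustrative only and are not drawn from the Warwick Framework specification.

```python
# Hedged sketch of the Warwick Framework's container/package aggregation.
# Names and structure are illustrative, not taken from the specification.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Package:
    """One logically distinct metadata set (e.g. Dublin Core, rights, terms of use)."""
    package_type: str                 # lets agents select packages they understand
    content: Optional[str] = None     # internally referenced: metadata carried in place
    urn: Optional[str] = None         # externally referenced: persistent name only

@dataclass
class Container:
    """Aggregates separately administered packages for a single information object."""
    object_id: str
    packages: List[Package] = field(default_factory=list)

    def packages_of_type(self, package_type: str) -> List[Package]:
        # An agent can process the package types it knows and ignore the rest.
        return [p for p in self.packages if p.package_type == package_type]

container = Container(
    object_id="urn:example:report-42",
    packages=[
        Package("dublin-core", content="<dc:title>Annual Report</dc:title>"),
        Package("rights", urn="urn:example:rights-policy-7"),  # held on another server
    ],
)
print([p.package_type for p in container.packages_of_type("dublin-core")])
```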
Conclusions
RQ "We run the danger, with the full expressiveness of the Warwick Framework, of creating such complexity that the metadata is effectively useless. Finding the appropriate balance is a central design problem. ... Definers of specific metadata sets should ensure that the set of operations and semantics of those operations will be strictly defined for a package of a given type. We expect that a limited set of metadata types will be widely used and 'understood' by browsers and agents. However, the type system must be extensible, and some method that allows existing clients and agents to process new types must be a part of a full implementation of the Framework. ... There is a need to agree on one or more syntaxes for the various metadata sets. Even in the context of the relatively simple World Wide Web, the Internet is often unbearably slow and unreliable. Connections often fail or time out due to high load, server failure, and the like. In a full implementation of the Warwick Framework, access to a "document" might require negotiation across distributed repositories. The performance of this distributed architecture is difficult to predict and is prone to multiple points of failure. ... It is clear that some protocol work will need to be done to support container and package interchange and retrieval. ... Some examination of the relationship between the Warwick Framework and ongoing work in repository architectures would likely be fruitful.
Type
Report
Title
Introduction to the Victoria Electronic Records Strategy (VERS) PROS 99/007 (Version 2)
CA VERS has two major goals: the preservation of electronic records and their efficient management. Version 2 has an improved structure, additional metadata elements, requirements for preservation, and compliance requirements for agencies. "Export" compliance allows agencies to maintain their records within their own recordkeeping systems and add a module that generates the VERS format for export, especially for long-term preservation. "Native" compliance, in which records are converted to the long-term preservation format upon registration, is seen as the ideal approach.
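As an illustration of the "export" compliance model, the sketch below shows, in hypothetical Python with invented element names, how an agency system might bundle a native record and its metadata into a self-contained preservation envelope at export time. The actual VERS encapsulation format is defined in PROS 99/007 and differs in detail; nothing here reproduces that specification.

```python
# Hedged sketch of "export" compliance: the agency keeps records in its own
# system and generates a preservation envelope on export. Field names are
# invented and do NOT reproduce the VERS format defined in PROS 99/007.
import base64
import json
from datetime import datetime, timezone

def export_record(content: bytes, metadata: dict) -> str:
    """Bundle record content and its recordkeeping metadata into one envelope."""
    envelope = {
        "exported_at": datetime.now(timezone.utc).isoformat(),
        "metadata": metadata,
        "content_base64": base64.b64encode(content).decode("ascii"),
    }
    return json.dumps(envelope, indent=2)

print(export_record(
    b"%PDF-1.4 ...",                                    # record in its native format
    {"title": "Planning decision 2001/33", "agency": "Example Agency"},
))
```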
Type
Report
Title
Victorian Electronic Records Strategy: Final Report
In July 1999, the Australian Recordkeeping Metadata Schema (RKMS) was approved by its academic and industry steering group. This metadata set now joins other community-specific sets in being available for use and implementation into workplace applications. The RKMS has inherited elements from and built on many other metadata standards associated with information management. It has also contributed to the development of subsequent sector-specific recordkeeping metadata sets. The importance of the RKMS as a framework for 'mapping' or reading other sets and also as a standardised set of metadata available for adoption in diverse implementation environments is now emerging. This paper explores the context of the SPIRT Recordkeeping Metadata Project, and the conceptual models developed by the SPIRT Research Team as a framework for standardising and defining Recordkeeping Metadata. It then introduces the elements of the SPIRT Recordkeeping Metadata Schema and explores its functionality before discussing implementation issues with reference to document management and workflow technologies.
Critical Arguements
CA Much of the metadata work done so far has rested on the passive assumption that records are document-like objects. Instead, records need to be seen as active entities in business transactions.
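One way to picture the contrast is the hypothetical Python sketch below: rather than attaching a metadata description once to a finished document, recordkeeping metadata is bound to the record as each business event occurs. The class and field names are illustrative and are not taken from the RKMS element set.

```python
# Hedged sketch of treating a record as an active participant in business
# transactions: metadata is bound to the record as each event occurs, not
# added once to a static document. Field names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class RecordEvent:
    action: str        # e.g. "created", "registered", "transmitted"
    agent: str         # person or system responsible
    occurred_at: str

@dataclass
class ActiveRecord:
    record_id: str
    events: List[RecordEvent] = field(default_factory=list)

    def bind(self, action: str, agent: str) -> None:
        """Capture metadata at the moment the business event happens."""
        self.events.append(
            RecordEvent(action, agent, datetime.now(timezone.utc).isoformat())
        )

record = ActiveRecord("claim-2003-0042")
record.bind("created", "claims-officer")
record.bind("transmitted", "workflow-engine")
print([e.action for e in record.events])
```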
Conclusions
RQ In order to decide which elements of the RKMS to use, organizations need to delineate the scope of specific implementations, including how and when records need to be bound with metadata.
Type
Web Page
Title
National States Geographic Information Council (NSGIC) Metadata Primer -- A "How To" Guide on Metadata Implementation
The primer begins with a discussion of what metadata is and why metadata is important. This is followed by an overview of the Content Standards for Digital Geospatial Metadata (CSDGM) adopted by the Federal Geographic Data Committee (FGDC). Next, the primer focuses on the steps required to begin collecting and using metadata. The fourth section deals with how to select the proper metadata creation tool from the growing number being developed. Section five discusses the mechanics of documenting a data set, including strategies on reviewing the output to make sure it is in a useable form. The primer concludes with a discussion of other assorted metadata issues.
Critical Arguements
CA The Metadata Primer is one phase of a larger metadata research and education project undertaken by the National States Geographic Information Council and funded by the Federal Geographic Data Committee's Competitive Cooperative Agreements Program (CCAP). The primer is designed to provide a practical overview of the issues associated with developing and maintaining metadata for digital spatial data. It is targeted toward an audience of state, local, and tribal government personnel. The document provides a "cook book" approach to the creation of metadata. Because much of the most current information on metadata resides on the Internet, the primer summarizes relevant material available from other World Wide Web (WWW) home pages.
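To suggest what "documenting a data set" looks like in practice, the sketch below builds a skeletal metadata record in Python. It is loosely modeled on the identification section of the FGDC CSDGM, but the record is intentionally incomplete, the example values are invented, and it has not been validated against the standard.

```python
# Hedged sketch of documenting a data set: a skeletal metadata record loosely
# modeled on the FGDC CSDGM identification section. Incomplete, example values
# are hypothetical, and the output is not validated against the standard.
import xml.etree.ElementTree as ET

metadata = ET.Element("metadata")
idinfo = ET.SubElement(metadata, "idinfo")                  # identification information
citeinfo = ET.SubElement(ET.SubElement(idinfo, "citation"), "citeinfo")
ET.SubElement(citeinfo, "origin").text = "Example State GIS Office"
ET.SubElement(citeinfo, "pubdate").text = "1999"
ET.SubElement(citeinfo, "title").text = "County boundaries (example data set)"
descript = ET.SubElement(idinfo, "descript")
ET.SubElement(descript, "abstract").text = "Polygon boundaries of counties."
ET.SubElement(descript, "purpose").text = "Base layer for state mapping programs."

print(ET.tostring(metadata, encoding="unicode"))
```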
Conclusions
RQ To what extent could the NSGIC recommendations be used for non-geographic applications?
SOW
DC FGDC approved the Content Standard for Digital Geospatial Metadata (FGDC-STD-001-1998) in June 1998. FGDC is a 19-member interagency committee composed of representatives from the Executive Office of the President, Cabinet-level and independent agencies. The FGDC is developing the National Spatial Data Infrastructure (NSDI) in cooperation with organizations from State, local and tribal governments, the academic community, and the private sector. The NSDI encompasses policies, standards, and procedures for organizations to cooperatively produce and share geographic data.
Type
Web Page
Title
Capturing Electronic Transactional Evidence: The Future
CA NSW has issued their metadata standard because one of the "key methods" for assuring the long-term preservation of e-records is the use of standardized sets of recordkeeping metadata. Not only can their metadata strategy help public offices meet their individual requirements for accu