An ongoing multiple-volume set with an updated index.
Publisher
West Group
Publication Location
St. Paul, MN
Critical Arguments
CA "Contains over 400 separate titles on a broad range of legal topics which, taken together, systematically describe the entire field of American legal doctrine. Documents available for each topic may include a summary, topic contents, each (TRUNCATED)
Type
Book Whole
Title
Words and Phrases: All Judicial Constructions and Definitions of Words and Phrases by the States and Federal Courts
Artiste is a European project developing a cross-collection search system for art galleries and museums. It combines image-content retrieval with text-based retrieval and uses RDF mappings to integrate diverse databases. The test sites of the Louvre, the Victoria and Albert Museum, the Uffizi Gallery and the National Gallery London each provide their own database schema for existing metadata, avoiding the need for migration to a common schema. The system accepts a query based on one museum's fields and converts it, through an RDF mapping, into a form suitable for querying the other collections. Because some of the image-processing algorithms can be slow, the system is session-based, allowing the user to return to the results later. The system has been built within a J2EE/EJB framework, using the JBoss Enterprise Application Server.
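The abstract above describes translating a query from one museum's fields into another's via an RDF mapping. A toy sketch of the idea follows; field names and the mapping table are invented for illustration, and Artiste expresses such mappings in RDF rather than Python dictionaries, alongside image-content query terms.

```python
# Each site keeps its own schema; the mapping relates local fields to a
# shared concept, so a query can be re-expressed per collection.
# All field names below are hypothetical.
MAPPINGS = {
    "louvre": {"titre": "title", "auteur": "creator"},
    "vam": {"object_name": "title", "maker": "creator"},
}

def translate_query(query: dict, source: str, target: str) -> dict:
    """Re-express a query written in `source` fields using `target` fields."""
    to_concept = MAPPINGS[source]  # local field -> shared concept
    to_target = {c: f for f, c in MAPPINGS[target].items()}  # concept -> local field
    return {to_target[to_concept[field]]: value for field, value in query.items()}

# A Louvre-style query re-expressed for the V&A schema:
q = translate_query({"titre": "Mona Lisa"}, "louvre", "vam")
```

The shared-concept layer is what lets each gallery keep its existing schema: only one mapping per site is needed, not one per pair of sites.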
Secondary Title
WWW2002: The Eleventh International World Wide Web Conference
Publisher
International World Wide Web Conference Committee
ISBN
1-880672-20-0
Critical Arguments
CA "A key aim is to make a unified retrieval system which is targeted to users' real requirements and which is usable with integrated cross-collection searching. Museums and Galleries often have several digital collections ranging from public access images to specialised scientific images used for conservation purposes. Access from one gallery to another was not common in terms of textual data and not done at all in terms of image-based queries. However the value of cross-collection access is recognised as important for example in comparing treatments and conditions of paintings. While ARTISTE is primarily designed for inter-museum searching it could equally be applied to museum intranets. Within a museum's intranet there may be systems which are not interlinked due to local management issues."
Conclusions
RQ "The query language for this type of system is not yet standardised but we hope that an emerging standard will provide the session-based connectivity this application seems to require due to the possibility of long query times." ... "In the near future, the project will be introducing controlled vocabulary support for some of the metadata fields. This will not only make retrieval more robust but will also facilitate query expansion. The Louvre's multilingual thesaurus will be used in order to ensure greater interoperability. The system is easily extensible to other multimedia types such as audio and video (eg by adding additional query items such as "dialog" and "video sequence" with appropriate analysers). A follow-up project is scheduled to explore this further. There is some scope for relating our RDF query format to the emerging query standards such as XQuery and we also plan to feed our experience into standards such as the ZNG initiative."
SOW
DC "The Artiste project is a European Commission funded collaboration, investigating the use of integrated content and metadata-based image retrieval across disparate databases in several major art galleries across Europe. Collaborating galleries include the Louvre in Paris, the Victoria and Albert Museum in London, the Uffizi Gallery in Florence and the National Gallery in London." ... "Artiste is funded by the European Community's Framework 5 programme. The partners are: NCR, The University of Southampton, IT Innovation, Giunti Multimedia, The Victoria and Albert Museum, The National Gallery, The research laboratory of the museums of France (C2RMF) and the Uffizi Gallery. We would particularly like to thank our collaborators Christian Lahanier, James Stevenson, Marco Cappellini, John Cupitt, Raphaela Rimabosci, Gert Presutti, Warren Stirling, Fabrizio Giorgini and Roberto Vacaro."
Type
Conference Proceedings
Title
Integrating Metadata Schema Registries with Digital Preservation Systems to Support Interoperability: A Proposal
There are a large number of metadata standards and initiatives that have relevance to digital preservation, e.g. those designed to support the work of national and research libraries, archives and digitization initiatives. This paper introduces some of these, noting that the developers of some have acknowledged the importance of maintaining or re-using existing metadata. It is argued here that the implementation of metadata registries as part of a digital preservation system may assist repositories in enabling the management and re-use of this metadata and may also help interoperability, namely the exchange of metadata and information packages between repositories.
Publisher
2003 Dublin Core Conference: Supporting Communities of Discourse and Practice-Metadata Research & Applications
Publication Location
Seattle, WA
Critical Arguments
CA "This paper will introduce a range of preservation metadata initiatives including the influential Open Archival Information System (OAIS) reference model and a number of other initiatives originating from national and research libraries, digitization projects and the archives community. It will then comment on the need for interoperability between these specifications and propose that the implementation of metadata registries as part of a digital preservation system may help repositories manage diverse metadata and facilitate the exchange of metadata or information packages between repositories."
Conclusions
RQ "The plethora of metadata standards and formats that have been developed to support the management and preservation of digital objects leaves us with several questions about interoperability. For example, will repositories be able to cope with the wide range of standards and formats that exist? Will they be able to transfer metadata or information packages containing metadata to other repositories? Will they be able to make use of the 'recombinant potential' of existing metadata?" ... "A great deal of work needs to be done before this registry-based approach can be proved to be useful. While it would undoubtedly be useful to have registries of the main metadata standards developed to support preservation, it is less clear how mapping-based conversions between them would work in practice. Metadata specifications are based on a range of different models and conversions often lead to data loss. Also, much more consideration needs to be given to the practical issues of implementation." 
SOW
DC Michael Day is a research officer at UKOLN, which is based at the University of Bath. He belongs to UKOLN's research and development team, and works primarily on projects concerning metadata, interoperability and digital preservation. 
Type
Conference Proceedings
Title
Preserving the Fabric of Our Lives: A Survey of Web Preservation Initiatives
This paper argues that the growing importance of the World Wide Web means that Web sites are key candidates for digital preservation. After an [sic] brief outline of some of the main reasons why the preservation of Web sites can be problematic, a review of selected Web archiving initiatives shows that most current initiatives are based on combinations of three main approaches: automatic harvesting, selection and deposit. The paper ends with a discussion of issues relating to collection and access policies, software, costs and preservation.
Secondary Title
Research and Advanced Technology for Digital Libraries, 7th European Conference, ECDL 2003, Trondheim, Norway, August 2003 Proceedings
Publisher
Springer
Publication Location
Berlin
Critical Arguments
CA "UKOLN undertook a survey of existing Web archiving initiatives as part of a feasibility study carried out for the Joint Information Systems Committee (JISC) of the UK further and higher education funding councils and the Library of the Wellcome Trust. After a brief description of some of the main problems with collecting and preserving the Web, this paper outlines the key findings of this survey." (p. 462) Addresses technical, legal and organizational challenges to archiving the World Wide Web. Surveys major attempts that have been undertaken to archive the Web, highlights the advantages and disadvantages of each, and discusses problems that remain to be addressed.
Conclusions
RQ "It is hoped that this short review of existing Web archiving initiatives has demonstrated that collecting and preserving Web sites is an interesting area of research and development that has now begun to move into a more practical implementation phase. To date, there have been three main approaches to collection, characterised in this report as 'automatic harvesting,' 'selection' and 'deposit.' Which one of these has been implemented has normally depended upon the exact purpose of the archive and the resources available. Naturally, there are some overlaps between these approaches but the current consensus is that a combination of them will enable their relative strengths to be utilised. The longer-term preservation issues of Web archiving have been explored in less detail." (p. 470)
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
Type
Conference Proceedings
Title
Place, Interface and Cyberspace: Archives at the Edge, Proceedings of the 1998 Conference of the Australian Society of Archivists
CA Online, humans act both as universal everymen and as community members with their own cultural assumptions. When people transact business online, the legal and social relationships engendered place on each participant "a range of rights and responsibilities that underpin the regulation of the net as a community." (p.104) So while the interrelations may seem more complex in cyberspace, in the end establishing the relationships between key parties is still crucial to ascertaining their legal obligations, whether they are online or offline. (p.120)
Conclusions
RQ In order to ensure that evidential requirements are extended to net transactions, we must address the following questions: Are we revisiting the problems of electronic information systems without recordkeeping functionality in the cyberspace environment? Can intranet systems linked to the Net retrieve transactions with all their context intact?
Type
Conference Proceedings
Title
Practical experiences of the Digital Preservation Testbed
CA "The Digital Preservation Testbed is researching three different approaches to long-term digital preservation: migration, emulation and XML. Not only will the effectiveness of each approach be evaluated, but also their limits, costs and application potential. Experiments are taking place on text documents, spreadsheets, emails and databases of different size, complexity and nature."
Conclusions
RQ "New experiments expected in 2002 are the migration of spreadsheets, conversion of spreadsheets and databases into XML and a proof of concept with the UVC for text documents and spreadsheets. ... Eventually at the end of 2003 the Testbed project will provide: advice on how to deal with current digital records; recommendations for an appropriate preservation strategy or a combination of strategies; functional requirements for a preservation function; cost models of the various preservation strategies; a decision model for preservation strategy; recommendations concerning guidelines and regulations."
SOW
DC "The Digital Preservation Testbed is part of the non-profit organisation ICTU. ICTU is the Dutch organisation for ICT and government. ICTU's goal is to contribute to the structural development of e-government. This will result in improving the work processes of government organisations, their service to the community and interaction with the citizens. ... In case of the Digital Preservation Testbed the principals are the Ministry of the Interior, Jan Lintsen and the Dutch National Archives, Maarten van Boven. Together with Public Key Infrastructure, Digital Longevity is the fundament of the ELO-house."
CA "Ironically, electronic records systems make it both possible to more fully capture provenance than paper records systems did and at the same time make it more likely that provenance will be lost and that archives, even if they are preserved, will therefore lack evidential value. This paper explores the relationship between provenance and evidence and its implications for management of paper or electronic information systems." (p. 177)
Conclusions
RQ "Electronic information systems, therefore, present at least two challenges to archivists. The first is that the designers of these systems may have chosen to document less contextual information than may be of interest to archivists when they designed the system. The second is that the data recorded in any given information system will, someday, need to be transferred to another system. ... [A]rchivists will need to return to fundamental archival principles to determine just what they really wanted to save anyway. ... It may be that archivists will be satisfied with the degree of evidential historicity they were able to achieve in paper based record systems, in which case there are very few barriers to implementing successful electronic based archival environments. Or archivists may decide that the fuller capability of tracking the actual participation of electronic data objects in organizational activities needs to be documented by archivally satisfactory information systems, in which case they will need to define those levels of evidential historicity that must be attained, and specify the systems requirements for such environments. ... At a meeting on electronic records management research issues sponsored by the National Historical Publications and Records Commission in January 1991, participants identified the concept of technological and economic plateaux in electronic data capture and archiving as an important arena for research ... Hopefully this research will produce information to help archivists make decisions regarding the amount of contextual information they can afford to capture and the requirements of systems designed to document context along with managing data content. ... I will not be surprised as we refine our concepts of evidential historicity to discover that the concept of provenance takes on even greater granularity." (p. 192-193)
CA Discusses the ways traditional archival science can inform IT, and the ways IT can help the goals of archival science be achieved more easily and efficiently.
Conclusions
RQ "When archivists work with information technologies or electronic archiving specialists, they have a lot to offer. They are the ones who have the conceptual key to the analysis and design of the new archiving systems." (p. 174)
This study focuses upon access to authentic electronic records that are no longer required in day-to-day operations and that have been set aside in a recordkeeping system or storage repository for future reference. One school of thought, generally associated with computer information technology specialists, holds that long-term access to electronic records is primarily a technological issue with little attention devoted to authenticity. Another school of thought, associated generally with librarians, archivists, and records managers, contends that long-term access to electronic records is as much an intellectual issue as it is a technological issue. This latter position is clearly evident in several recent research projects and studies about electronic records whose findings illuminate the discussion of long-term access to electronic records. Therefore, a review of eight research projects highlighting findings relevant for long-term access to electronic records begins this chapter. This review is followed by a discussion, from the perspective of archival science, of nine questions that a long-term access strategy must take into account. The nine issues are: What is a document?; What is a record?; What are authentic electronic records?; What does "archiving" mean?; What is an authentic reformatted electronic record?; What is a copy of an authentic electronic record?; What is an authentic converted electronic record?; What is involved in the migration of authentic electronic records?; What is technology obsolescence?
Book Title
Authentic Electronic Records: Strategies for Long-Term Access
Publisher
Cohasset Associates, Inc.
Publication Location
Chicago
ISBN
0970064004
Critical Arguments
CA "Building upon the key concepts and concerns articulated by the studies described above, this report attempts to move the discussion of long-term access to electronic records toward more clearly identified, generally applicable and readily im(TRUNCATED)
Conclusions
RQ
SOW
DC This book chapter was written by Charles M. Dollar for Cohasset Associates, Inc. Mr. Dollar has "twenty-five years of experience in working with electronic records as a manager at the National Archives and Records Administration, as an archival educator at the University of British Columbia, and a consultant to governments and businesses in North America, Asia, Europe, and the Middle East." Cohasset Associates Inc. is "one of the nation's foremost consulting firms specializing in document-based information management."
Type
Journal
Title
Strategies for managing electronic records: A new archival paradigm? An affirmation of our archival traditions?
CA It is still too early to tell which models (like Pitt or UBC) actually work until we have had time to evaluate them. Archivists need to learn new skills in order to be effective in the electronic environment, and cannot wait for an out-of-the-box solution. Most likely, any solution will require a combination of strategies. Most of all, one must remain flexible and open to new ways of doing things.
Phrases
<P1> Unlike paper documents where context and physical form are united in a medium that provides the record of the transaction, and where relationships among documents can be observed, electronic records are not physical but are logically constructed and often "virtual" entities. Therefore, it is argued, efforts to document business transactions based on the examination of "views" or of automated forms will fail to reveal the nature of the business transactions. Consequently, methods other than direct observation and review must be employed to properly document automated systems. (p.25) <P2> System metadata typically do not contain all the information archivists need to describe electronic records, in particular, all the necessary contextual data required to understand the context of the transaction are not present. Therefore it is suggested that archivists will need to know which metadata elements are required to fully describe these records and must be in a position to add these descriptive elements to the system, preferably at the design stage. (p.26)
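The second phrase's point, that archivists must specify and add the descriptive elements system metadata lacks, preferably at design time, can be sketched roughly as follows. All element names and values here are invented for illustration, not drawn from the article.

```python
# System metadata: what the application records by itself (invented values).
SYSTEM_METADATA = {"filename": "form-0042.xml", "modified": "2001-03-02T10:15:00Z"}

# Contextual elements an archivist would require at the design stage
# (hypothetical names; a real profile would follow a metadata standard).
ARCHIVAL_ELEMENTS = ("business_function", "transaction_type", "responsible_agent")

def describe(system_md: dict, contextual: dict) -> dict:
    """Merge system metadata with the contextual elements needed to document
    the transaction; refuse to produce a description with gaps."""
    missing = [e for e in ARCHIVAL_ELEMENTS if e not in contextual]
    if missing:
        raise ValueError(f"contextual metadata incomplete: {missing}")
    return {**system_md, **contextual}
```

Validating at capture time, rather than hoping to reconstruct context later, is the design-stage intervention the phrase argues for.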
Conclusions
RQ What is involved in effecting a major shift from creating descriptive data to capturing, managing and adding value to system metadata?
CA A major future challenge for recordkeeping professionals is to maximize knowledge via the deft use of metadata as a management tool.
Phrases
<P1> Recordkeeping in the 21st century will have to confront the fact that the very definition of what constitutes a record is dynamically changing. (p.6) <P2> "With the advent of the Internet and the streaming of information from the unchartered, open environment which the Internet represents, it appears that public institutions will act to consider and incorporate as part of their best practices the use of new technologies, such as digital signatures and public key encryption, to ensure that authentic and trustworthy information is captured as part of their dealings with the public at large." (p.5)
Conclusions
RQ How will we deal with the records of the future -- electronic documents with a variety of embedded, interactive attachments?
Type
Journal
Title
Reality and Chimeras in the Preservation of Electronic Records
CA An emulation approach is not viable for e-records preservation because it preserves the "wrong thing": systems functionality rather than records. Consequently, an emulation solution would not preserve e-records as evidence "even if it could be made to work."
Phrases
<P1>Electronic records that are not moved out of obsolete hardware and software environments are very likely to die with them. <P2> Failure to examine in detail what makes an electronic record evidence over time has led Rothenberg, and many others, to assume they want to preserve system functionality. (p.2) <P3> The state of a database at any given moment is not a record. (p.2) <P4> If we want to preserve electronic records, what we really want are records of the actual inputs and outputs from the system to be maintained as evidence over time. This does not require the information system to function as it once did. All (!) it requires is that we can capture all transactions entering and leaving the system when they are created, ensuring that the original context of their creation and content is documented, and that the requirements of evidence are preserved over time. (p.2)
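The fourth phrase's alternative to preserving system functionality, capturing each transaction with its context at the moment of creation, might look roughly like this. Function and field names are invented; this is a sketch of the principle, not the author's implementation.

```python
import hashlib
from datetime import datetime, timezone

def capture_transaction(content: bytes, business_context: dict) -> dict:
    """Capture one transaction entering or leaving a system as a record:
    bind the content to contextual metadata at creation time, with a digest
    so later alteration of the content is detectable."""
    metadata = {
        "captured_at": datetime.now(timezone.utc).isoformat(),
        "context": business_context,
        "content_sha256": hashlib.sha256(content).hexdigest(),
    }
    return {"content": content.hex(), "metadata": metadata}

rec = capture_transaction(b"PO-1234 approved", {"process": "purchasing"})
```

Note that nothing here replays the originating system: the record is the captured input or output plus its documented context, which is exactly the phrase's point.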
Conclusions
RQ Metadata encapsulation strategies need to identify how metadata will be captured at the time of a record's creation, how it will be stored over time while supporting the use of the record by authorized users and more generally how the recordkeeping infrastructure will be constructed and maintained.
Type
Journal
Title
Capturing records' metadata: Unresolved questions and proposals for research
The author reviews a range of the research questions still unanswered by research on the capture of metadata required for recordness. These include how to maintain inviolable linkages between records and their metadata in a variety of architectures, what structure metadata content should take, the semantics of records metadata and that of other electronic sources, how new metadata can be acquired by records over time, maintaining the meaning of contextual metadata over time, the use of metadata in records management and the design of environments in which Business Acceptable Communications, or BAC (those with appropriate evidential metadata), can persist.
Critical Arguments
CA "My research consists of model building which enables the construction of theories and parallel implementations based on shared assumptions. Some of these models are now being tested in applications, so this report reflects both what we do not yet know from abstract constructs and questions being generated by field testing. " ... Bearman overviews research questions such as semantics, syntax, structure and persistence of metadata that still need to be addressed.
Phrases
<P1> Records are evidence when they are bound to appropriate metadata about their content, structure and context. <P2> The metadata required for evidence is described in the Reference Model for Business Acceptable Communications (BAC). <P3> Metadata which is required for evidence must continue to be associated with the record to which it relates over time and neither it nor the record content can be alterable. <P4> To date we have only identified three implementations which, logically, could allow metadata to retain this inviolable connection. Metadata can be: kept in a common envelope WITH a record (encapsulated), bound TO a record (by integrity controls within an environment), or LINKED with a record through a technical and/or social process (registration, key deposit, etc.). <P5> Metadata content was defined in order to satisfy a range of functional requirements of records, hence it ought to have a structure which enables it to serve these functions effectively and in concrete network implementations. <warrant> <P6> Clusters of metadata must operate together. Clusters of metadata are required by different processes which take place at different times, for different software clients, and within a variety of processes. Distinct functions will need access to specified metadata substructures and must be able to act on these appropriately. Structures have been proposed in the Reference Model for Business Acceptable Communications. <P7> Metadata required for recordness must, logically, be standard; that required for administration of recordkeeping systems is extensible and locally variable. <P8> Records metadata must be semantically homogenous but it is probably desirable for it to be syntactically heterogeneous and for a range of protocols to operate against it. Records metadata management system requirements have both an internal and external aspect; internally they satisfy management requirements while externally they satisfy on-going recordness requirements. 
<P9> The metadata has to come either from a specific user/session or from rules defined to extract data either from a layer in the application or a layer between the application and the recording event. <P10> A representation of the business context must exist from which the record-creating event can obtain metadata values. <P11> Structural metadata must both define the dependent structures and identify them to a records management environment which is "patrolling" for dependencies which are becoming risky in the evolving environment in order to identify needs for migration. <P12> BAC-conformant environments could reduce overheads if standards supported the uniform management of records from the point of issue to the point of receipt. Could redundancy now imposed by both paper and electronic processes be dramatically reduced if records referenced other records?
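The first of the three linkage options listed in <P4>, keeping metadata in a common envelope with the record, can be illustrated with a toy example. This is a sketch only: a bare digest detects alteration but an attacker who can rewrite the seal defeats it, which is why the other options (environment integrity controls, registration or key deposit) exist.

```python
import hashlib
import json

def encapsulate(record: str, metadata: dict) -> dict:
    """Keep metadata in a common envelope WITH the record, sealed by a
    digest over both, so a change to either part is detectable."""
    payload = json.dumps({"record": record, "metadata": metadata}, sort_keys=True)
    return {"record": record, "metadata": metadata,
            "seal": hashlib.sha256(payload.encode()).hexdigest()}

def is_intact(envelope: dict) -> bool:
    """Recompute the seal and compare it with the stored one."""
    payload = json.dumps({"record": envelope["record"],
                          "metadata": envelope["metadata"]}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest() == envelope["seal"]

env = encapsulate("memo text", {"creator": "registry", "date": "1997-01-01"})
```

Serializing with `sort_keys=True` gives a stable byte representation, so the same record and metadata always produce the same seal.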
Conclusions
RQ "All the proposed methods have some degree of external dependency. What are the implications of software dependencies? Encapsulation, integrity controls and technico-social process are all software dependent. Is this avoidable? Can abstract reference models of the metadata captured by these methods serve to make them effectively software independent?" ... "What are the relative overhead costs of maintaining the systems which give adequate societal assurances of records retention following any of these approaches? Are there some strategies that are currently more efficient or effective? What are the organizational requirements for implementing metadata capture systems? In particular, what would the costs of building such systems within a single institution be versus the costs of implementing records metadata adhering communications servers on a universal scale?" ... "Can we model mechanisms to enable an integrated environment of recordkeeping throughout society for all electronically communicated transactions?" ... "Are the BAC structures workable? Complete? Extensible in ways that are known to be required? For example, metadata required for "recordness" is created at the time of the creation of the records but other metadata, as premised by the Warwick Framework, may be created subsequently. Are these packets of metadata orthogonal with respect to recordness? If not, how are conflicts dealt with?" ... "Not all metadata references fixed facts. Thus, for example, we have premised that proper reference to a retention schedule is a citation to an external source rather than a date given within the metadata values of a record. Similar external references are required for administration of shifting access permissions. What role can registries (especially rights clearinghouses) play in a world of electronic records? 
How well do existing languages for permission management map to the requirements of records administration, privacy and confidentiality protection, security management, records retention and destruction, etc." ... "Not all records will be created with equally perfect metadata. Indeed risk-based decisions taken by organizations in structuring their records' capture are likely to result in conscious decisions to exclude certain evidential metadata. What are the implications of incomplete metadata on an individual organization level and on a societal level? Does the absence of data as a result of policy need to be noted? And if so, how?" ... "Since metadata has owners, how do owners administer records' metadata over time? In particular, since records contain records, how are the layers of metadata exposed for management and administrative needs (if internal metadata documenting dependencies can slip through the migration process, we will end up with records that cannot serve as evidence. If protected records within unprotected records are not protected, we will end up with insecure records environments, etc. etc.)." ... "In principle, the BAC could be expressed as Dublin metadata and insofar as it cannot be, the Dublin metadata will be inadequate for evidence. What other syntax could be used? How could these be comparatively tested?" ... "Could Dublin Core metadata, if extended by qualifying schema, serve the requirements of recordness? Records are, after all, documents in the Dublin sense of fixed information objects. What would the knowledge representation look like?" ... "Strategies for metadata capture currently locate the source of metadata either in the API layer, or the communications system, using data provided by the application (an analysis supports defining which data and where they can be obtained), from the user interface layer, or from the business rules defined for specified types of communication pathways. 
Can all the required metadata be obtained by some combination of these sources? In other words, can all the metadata be acquired from sources other than content created by the record-creator for the explicit and sole purpose of documentation (since such data is both suspect in itself and the demand for it is annoying to the end user)?" ... "Does the capture of metadata from the surrounding software layers require the implementation of a business-application specific engine, or can we design generic tools that provide the means by which even legacy computing systems can create evidential records if the communication process captures the interchange arising from a record-event and binds it with appropriate metadata?" ... "What kinds of representations of business processes and structures can best carry contextualizing metadata at this level of granularity and simultaneously serve end user requirements? Are the discovery and documentation representations of provenance going to have to be different?" ... "Can a generic level of representation of context be shared? Do standards such as STEP provide adequate semantic rules to enable some meaningful exchange of business context information?" ... "Using past experiences of expired standards as an indicator, can the defined structural metadata support necessary migrations? Are the formal standards of the source and target environments adequate for actual record migration to occur?" ... "What metadata is required to document a migration itself?" ... "Reduction of redundancy requires record users to impose post-creation metadata locks on records created with different retention and access controls. To what extent is the Warwick Framework relevant to these packets and can architectures be created to manage these without their costs exceeding the savings?" ... "A number of issues about proper implementation depend on the evolution (currently very rapid) of metadata strategies in the broader Internet community. 
Issues such as unique identification of records, external references for metadata values, models for metadata syntax, etc. cannot be resolved for records without reference to the ways in which the wider community is addressing them. Studies that are supported for metadata capture methods need to be aware of, and flexible in reference to, such developments."
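The question raised above -- whether Dublin Core, extended by qualifying schemas, could carry "recordness" -- can be made concrete with a small sketch. In the Python snippet below, the `rkms` namespace URI and its element names (`businessFunction`, `retentionTrigger`) are invented for illustration only; they are not a real qualifier schema, merely the kind of evidential extension the research questions envisage alongside standard Dublin Core elements:

```python
# Hypothetical sketch: a Dublin Core record extended with invented
# recordkeeping qualifiers. The "rkms" namespace is an assumption for
# illustration, not an existing schema.
import xml.etree.ElementTree as ET

DC = "http://purl.org/dc/elements/1.1/"
RKMS = "http://example.org/recordkeeping#"  # hypothetical namespace

ET.register_namespace("dc", DC)
ET.register_namespace("rkms", RKMS)

record = ET.Element("record")
# Standard Dublin Core descriptive elements
for name, value in [
    ("title", "Authorisation memo 1998/042"),
    ("creator", "Finance Division"),
    ("date", "1998-03-17"),
]:
    ET.SubElement(record, f"{{{DC}}}{name}").text = value

# Qualifiers carrying the evidential context that plain Dublin Core lacks
ET.SubElement(record, f"{{{RKMS}}}businessFunction").text = "procurement"
ET.SubElement(record, f"{{{RKMS}}}retentionTrigger").text = "contract-expiry+7y"

xml_text = ET.tostring(record, encoding="unicode")

# Round-trip: both descriptive and contextual elements survive serialization
parsed = ET.fromstring(xml_text)
print(parsed.find(f"{{{DC}}}title").text)
print(parsed.find(f"{{{RKMS}}}businessFunction").text)
```

The sketch also illustrates the comparative-testing question: whether such qualifiers survive exchange between systems is exactly what a round-trip test of this kind would probe.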
The Getty Art History Information Program: Research Agenda for Cultural Heritage on Information Networks
Publication Year
1995
Critical Arguements
CA The inability to effectively preserve and authenticate electronic records presents a significant problem for humanities research, which depends on correct attribution and the ability to view resources long after they were created.
Phrases
<P1> Current research on software dependence and interoperability is not largely driven by archival concerns and takes a relatively short view on the requirement to preserve functionality. Little research has been done on modeling the information loss that accompanies multiple migrations or the risks inherent in the use of commercial systems before standards are developed, yet these are the critical questions being posed by archives. (p.2) <P2> The metadata required for recordness and the means to capture this data and ensure that it is bonded to electronic communications is the most significant area for research in the near future. (p.3) <P3> Within organizations, archivists must find automatic means of identifying the business process for which a record is generated. Such data modeling will become increasingly critical in an era of ongoing business re-engineering. If records are retained for their evidential significance and for a period associated with risk, then certain knowledge of their functional source is essential to their rational control. If they are retained for long-term informational value, knowledge of context is necessary to understand their significance. (p.3) <warrant>
Conclusions
RQ We need to research what value e-records have other than as a means of assessing accountability. How are they used, and what value do users derive from them? What do we need to know about a record's content to support the discovery of billions of records? How can our preservation solutions be made scalable?
CA Makes a distinction between archival description of the record at hand and documentation of the context of its creation. Argues the importance of the latter in establishing the evidentiary value of records, and criticizes ISAD(G) for its failure to account for context. "(1) The subject of documentation is, first and foremost, the activity that generated the records, the organizations and individuals who used the records, and the purposes to which the records were put. (2). The content of the documentation must support requirements for the archival management of records, and the representations of data should support life cycle management of records. (3) The requirements of users of archives, especially their personal methods of inquiry, should determine the data values in documentation systems and guide archivists in presenting abstract models of their systems to users." (p. 45-46)
Phrases
<P1> [T]he ICA Principles rationalize existing practice -- which the author believes as a practical matter we cannot afford; which fail to provide direct access for most archives users; and which do not support the day-to-day information requirements of archivists themselves. These alternatives are also advanced because of three, more theoretical, differences with the ICA Principles: (1) In focusing on description rather than documentation, they overlook the most salient characteristic of archival records: their status as evidence. (2) In proposing specific content, they are informed by the bibliographic tradition rather than by concrete analysis of the way in which information is used in archives. (3) In promoting data value standardization without identifying criteria or principles by which to identify appropriate language or structural links between the objects represented by such terms, they fail adequately to recognize that the data representation rules they propose reflect only one particular, and a limiting, implementation. (p. 33-34) <P2> Archives are themselves documentation; hence I speak here of "documenting documentation" as a process the objective of which is to construct a value-added representation of archives, by means of strategic information capture and recording into carefully structured data and information access systems, as a mechanism to satisfy the information needs of users including archivists. Documentation principles lead to methods and practices which involve archivists at the point, and often at the time, of records creation. In contrast, archival description, as described in the ICA Principles[,] is "concerned with the formal process of description after the archival material has been arranged and the units or entities to be described have been determined." 
(1.7) I believe documentation principles will be more effective, more efficient and provide archivists with a higher stature in their organizations than the post accessioning description principles proposed by the ICA. <warrant> (p. 34) <P3> In the United States, in any case, there is still no truly theoretical formulation of archival description principles that enjoys a widespread adherence, in spite of the acceptance of rules for description in certain concrete application contexts. (p. 37) <P4> [T]he MARC-AMC format and library bibliographic practices did not adequately reflect the importance of information concerning the people, corporate bodies and functions that generated records, and the MARC Authority format did not support appropriate recording of such contexts and relations. <warrant> (p. 37) <P5> The United States National Archives, even though it had contributed, to the data dictionary which led to the MARC content designation, all the data which it believed in 1983 that it would want to interchange, rejected the use of MARC two years later because it did not contain elements of information required by NARA for interchange within its own information systems. <warrant> (p. 37) <P6> [A]rchivists failed to understand then, just as the ISAD(G) standard fails to do now, that rules for content and data representation make sense in the context of the purposes of actual exchanges or implementation, not in the abstract, and that different rules or standards for end-products may derive from the same principles. (p. 
38) <P7> After the Committee on Archival Information Exchange of the Society of American Archivists was confronted with proposals to adopt many different vocabularies for a variety of different data elements, a group of archivists who were deeply involved in standards and description efforts within the SAA formed an Ad Hoc Working Group on Standards for Archival Description (WGSAD) to identify what types of standards were needed in order to promote better description practices.  WGSAD concluded that existing standards were especially inadequate to guide practice in documenting contexts of creation.  Since then, considerable progress has been made in developing frameworks for documentation, archival information systems architecture and user requirements analysis, which have been identified as the three legs on which the documenting documentation platform rests. <warrant> (p. 38) <P8> Documentation of organizational activity ought to begin long before records are transferred to archives, and may take place even before any records are created -- at the time when new functions are assigned to an organization. (p. 39) <P9> It is possible to identify records which will be created and their retention requirements before they are created, because their evidential value and informational content are essentially predetermined. (p. 39) <P10> Archivists can actively intervene through regulation and guidance to ensure that the data content and values depicting activities and functions are represented in such a way that will make them useful for subsequent management and retrieval of the records resulting from these activities. This information, together with systems documentation, defines the immediate information system context out of which the records were generated, in which they are stored, and from which they were retrieved during their active life. (p. 
39) <P11> Documentation of the link between data content and the context of creation and use of the records is essential if records (archives or manuscripts) are to have value as evidence. (p. 39) <P12> [C]ontextual documentation capabilities can be dramatically improved by having records managers actively intervene in systems design and implementation.  The benefits of proactive documentation of the context of records creation, however, are not limited to electronic records; the National Archives of Canada has recently revised its methods of scheduling to ensure that such information about important records systems and contexts of records creation will be documented earlier. <warrant> (p. 39) <P13> Documentation of functions and of information systems can be conducted using information created by the organization in the course of its own activity, and can be used to ensure the transfer of records to archives and/or their destruction at appropriate times. It ensures that data about records which were destroyed as well as those which were preserved will be kept, and it takes advantage of the greater knowledge of records and the purposes and methods of day-to-day activity that exist closer to the events. (p. 40) <P14> The facts of processing, exhibiting, citing, publishing and otherwise managing records becomes significant for their meaning as records, which is not true of library materials. (p. 41) <P15> [C]ontent and data representation requirements ought to be derived from analysis of the uses to which such systems must be put, and should satisfy the day to day information requirements of archivists who are the primary users of archives, and of researchers using archives for primary evidential purposes. (p. 41) <P16> The ICA Commission proposes a principle by which archivists would select data content for archival descriptions, which is that "the structure and content of representations of archival material should facilitate information retrieval." 
(5.1) Unfortunately, it does not help us to understand how the Commission selected the twenty-five elements of information identified as its standard, or how we could apply the principle to the selection of additional data content. It does, however, serve as a prelude to the question of which principles should guide archivists in choosing data values in their representations. (p. 42) <P17> Libraries have found that subject access based on titles, tables of contents, abstracts, indexes and similar formal subject analysis by-products of publishing can support most bibliographic research, but the perspectives brought to materials by archival researchers are both more varied and likely to differ from those of the records creators. (p. 43) <P18> The user should not only be able to employ a terminology and a perspective which are natural, but also should be able to enter the system with a knowledge of the world being documented, without knowing about the world of documentation. (p. 44) <P19> Users need to be able to enter the system through the historical context of activity, construct relations in that context, and then seek avenues down into the documentation. This frees them from trying to imagine what records might have survived -- documentation assists the user to establish the non-existence of records as well as their existence -- or to fathom how archivists might have described records which did survive. (p. 44) <P20> When they departed from the practices of Brooks and Schellenberg in order to develop means for the construction of union catalogues of archival holdings, American archivists were not defining new principles, but inventing a simple experiment. After several years of experience with the new system, serious criticisms of it were being leveled by the very people who had first devised it. (p. 45)
Conclusions
RQ "In short, documentation of the three aspects of records creation contexts (activities, organizations and their functions, and information systems), together with representation of their relations, is essential to the concept of archives as evidence and is therefore a fundamental theoretical principle for documenting documentation. Documentation is a process that captures information about an activity which is relevant to locating evidence of that activity, and captures information about records that are useful to their ongoing management by the archival repository. The primary source of information is the functions and information systems giving rise to the records, and the principal activity of the archivist is the manipulation of data for reference files that create richly-linked structures among attributes of the records-generating context, and which point to the underlying evidence or record." (p. 46)
CA Digital information is at great risk of becoming inaccessible due to media obsolescence and deterioration. Aside from proper care of media, effective digital preservation requires records management teams that maintain metadata and schedule media migration.
Phrases
<P1> If you design the system and data standards while thinking of multiple generations, you're in better shape. (p.25) <P2> We won't really know how long today's storage media will reliably hold data until we let it age a decade or two. And we won't see whether data is corrupted or missing until we try to read it. (p.25)
Type
Journal
Title
Migration Strategies within an Electronic Archive: Practical Experience and Future Research
Pfizer Central Research, Sandwich, England has developed an Electronic Archive to support the maintenance and preservation of electronic records used in the discovery and development of new medicines. The Archive has been developed to meet regulatory, scientific and business requirements. The long-term preservation of electronic records requires that migration strategies be developed both for the Archive and the records held within the Archive. The modular design of the Archive will facilitate the migration of hardware components. Selecting an appropriate migration strategy for electronic records requires careful project management skills allied to appraisal and retention management. Having identified when the migration of records is necessary, it is crucial that alternative technical solutions remain open.
DOI
10.1023/A:1009093604632
Critical Arguements
CA Describes a system of archiving and migration of electronic records (Electronic Archive) at Pfizer Central Research. "Our objective is to provide long-term, safe and secure storage for electronic records. The archive acts as an electronic record center and borrows much from traditional archive theory." (p. 301)
Phrases
<P1> Migration, an essential part of the life-cycle of electronic records, is not an activity that occurs in isolation. It is deeply related to the "Warrant" which justifies our record-keeping systems, and to the metadata which describe the data on our systems. (p. 301-302) <warrant> <P2> Our approach to electronic archiving, and consequently our migration strategy, has been shaped by the business requirements of the Pharmaceutical industry, the technical infrastructure in which we work, the nature of scientific research and development, and by new applications for traditional archival skills. <warrant> (p. 302) <P3> The Pharmaceutical industry is regulated by industry Good Practice Guidelines such as Good Laboratory Practice, Good Clinical Practice and Good Manufacturing Practice. Adherence to these standards is monitored by Government agencies such as the U.S. Food and Drug Administration (FDA) and in Britain the Department of Health (DoH). The guidelines require that data relating to any compound used in man be kept for the lifetime of that compound during its use in man. This we may take to be 40 years or more, during which time the data must remain identifiable and reproducible in case of regulatory inspection. <warrant> (p. 302) <P4> The record-keeping requirements of the scientific research and development process also shape migration strategies. ... Data must be able to be manipulated as well as being identifiable and legible. <warrant> (p. 303) <P5> [W]e have adapted traditional archival theory to our working environment and the new imperatives of electronic archiving. We have utilised retention scheduling to provide a vehicle for metadata file description alongside retention requirements. We have also placed great importance on appraisal as a tool to evaluate records which require to be migrated. (p. 303) <P6> Software application information is therefore collected as part of the metadata description for each file. (p. 
303) <P7> The migration of the database from one version to another or to a new schema represents a significant migration challenge in terms of the project management and validation necessary to demonstrate that a new database accurately represents our original data set. (p. 303-304) <P8> Assessing the risk of migration exercises is only one of several issues we have identified which need to be addressed before any migration of the archive or its components takes place. (p. 304) <P9> [F]ew organisations can cut themselves off totally from their existing record-keeping systems, whether they be paper or electronic. (p. 304) <P10> Critical to this model is identifying the data which are worthy of long-term preservation and transfer to the Archive. This introduces new applications for the retention and appraisal of electronic records. Traditional archival skills can be utilised in deciding which records are worthy of retention. Once they are in the Archive it will become critical to return time and again to those records in a process of "constant review" to ensure that records remain identifiable, legible and manipulatable. (p. 305) <P11> Having decided when to migrate electronic records, it is important to decide if it is worth it. Our role in Records Management is to inform the business leaders and budget holders when a migration of electronic records will be necessary. It is also our role to provide the business with an informed decision. A key vehicle in this process will be the retention schedule, which is not simply a tool to schedule the destruction of records. It could also be used to schedule software versions. More importantly, with event driven requirements it is a vehicle for constant review and appraisal of record holdings. The Schedule also defines important parts of the metadata description for each file in the Archive. 
The role of appraisal is critical in evaluating record holdings from a migration point of view and will demand greater time and resources from archivists and records managers. (p. 305)
Conclusions
RQ "Any migration of electronic records must be supported by full project management. Migration of electronic records is an increasingly complex area, with the advent of relational databases, multi-dimensional records and the World Wide Web. New solutions must be found, and new research undertaken. ... To develop a methodology for the migration of electronic records demands further exploration of the role of the "warrant" both external and internal to any organisation, which underpins electronic record-keeping practices. It will become critical to find new and practical ways to identify source software applications. ... The role of archival theory, especially appraisal and retention scheduling, in migration strategies demands greater consideration. ... The issues raised by complex documents are perhaps the area which demands the greatest research for the future. In this respect however, the agenda is being set by vendors promoting new technologies with short-term business goals. It may appear that electronic records do not lend themselves to long-term preservation. ... The development, management and operation of an Electronic Archive and migration strategy demands a multitude of skills that can only be achieved by a multi-disciplinary team of user, records management, IT, and computing expertise. Reassuringly, the key factor in migrating electronic archives will remain people." (p. 306)
Type
Journal
Title
Archival Issues in Network Electronic Publications
"Archives are retained information systems that are developed according to professional principles to meet anticipated demands of user clienteles in the context of the changing conditions created by legal environments and electronic or digital technologies. This article addresses issues in electronic publishing, including authentication, mutability, reformatting, preservation, and standards from an archival perspective. To ensure continuing access to electronically published texts, a special emphasis is placed on policy planning in the development and implementation of electronic systems" (p.701).
Critical Arguements
<P1> Archives are established, administered, and evaluated by institutions, organizations, and individuals to ensure the retention, preservation, and utilization of archival holdings (p.701) <P2> The three principal categories of archival materials are official files of institutions and organizations, publications issued by such bodies, and personal papers of individuals. . . . Electronic information technologies have had profound effects on aspects of all these categories (p.702) <P3> The primary archival concern with regard to electronic publishing is that the published material should be transferred to archival custody. When the transfer occurs, the archivist must address the issues of authentication, appraisal, arrangement, description, and preservation or physical protection (p.702) <P4> The most effective way to satisfy archival requirements for handling electronic information is the establishment of procedures and standards to ensure that valuable material is promptly transferred to archival custody in a format which will permit access on equipment that will be readily available in the future (p.702) <P5> Long-term costs and access requirements are the crucial factors in determining how much information should be retained in electronic formats (p.703) <P6> Authentication involves a determination of the validity or integrity of information. Integrity requires the unbroken custody of a body of information by a responsible authority or individual <warrant> (p.703) <P7> From an archival perspective, the value of information is dependent on its content and the custodial responsibility of the agency that maintains it -- e.g., the source determines authenticity. 
The authentication of archival information requires that it be verified as to source, date, and content <warrant> (p.704) <P8> Information that is mutable, modifiable, or changeable loses its validity if the persons adding, altering, or deleting information cannot be identified and the time, place and nature of the changes is unknown (p.704) <P9> [P]reservation is more a matter of access to information than it is a question of survival of any physical information storage media (p.704) <P10> [T]o approach the preservation of electronic texts by focusing on physical threats will miss the far more pressing matter of ensuring continued accessibility to the information on such storage media (p.706) <P11> If the information is to remain accessible as long as paper, preservation must be a front-end, rather than an ex post facto, action (p.708) <P12> [T]he preservation of electronic texts is first and foremost a matter of editorial and administrative policy rather than of techniques and materials (p.708) <P13> Ultimately, the preservation of electronic publications cannot be solely an archival issue but an administrative one that can be addressed only if the creators and publishers take an active role in providing resources necessary to ensure that ongoing accessibility is part of initial system and product design (p.709) <P14> An encouraging development is that SGML has been considered to be a critical element for electronic publishing because of its transportability and because it supports multiple representations of a single text . . . (p.711) <P15> Underlying all questions of access is the fundamental consideration of cost (p.711)
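The property that <P14> credits to SGML -- one marked-up source yielding multiple representations -- rests on separating content from presentation. A minimal sketch of the idea, using XML (an SGML profile) and Python's standard library; the document and its element names are invented for illustration:

```python
# Sketch of "multiple representations of a single text": one marked-up
# source produces both a reading text and a retrieval index. The sample
# document and tag names are hypothetical.
import xml.etree.ElementTree as ET

source = """<letter>
  <sender>National Archives</sender>
  <body>The records have been <term>appraised</term> for retention.</body>
</letter>"""

doc = ET.fromstring(source)

# Representation 1: plain reading text, with markup stripped
plain = "".join(doc.find("body").itertext())
print(plain)  # The records have been appraised for retention.

# Representation 2: an index of marked-up terms, usable for retrieval
terms = [t.text for t in doc.iter("term")]
print(terms)  # ['appraised']
```

Because both representations derive from the same source, the archival copy need only preserve the marked-up text, not each rendering.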
Type
Journal
Title
The Management of Digital Data: A metadata approach
CA "Super-metadata may well play a crucial role both in facilitating access to DDOs and in providing a means of selecting and managing the maintenance of these DDOs over time."
Phrases
<P1> The preservation of the intellectual content of DDOs brings into focus a major issue: "the integrity and authenticity of the information as originally recorded" (Graham, 1997). (p.365). <P2> The emergence of dynamic and living DDOs is presenting challenges to the conventional understanding of the preservation of digital resources and is forcing many organizations to reevaluate their strategies in the light of these rapid advances in information sources. The use of appropriate metadata is recognized to be essential in ensuring continued access to dynamic and living DDOs, but the standards for such metadata are not yet fully understood or developed. (p.369)
Conclusions
RQ How can we decide what to preserve ? How can we assure long-term access? What will be the cost of electronic archiving? Which metadata schema will be in use 10 years from now, and how will migration be achieved?
CA Unless organizations find a means to preserve e-records, the long-term memory of modern institutions will be at great risk. As society now moves from written records to virtual documents, archivists are offering their traditional understanding of the structure and context of recorded evidence as protection against the widespread amnesia now threatening our electronic world.
Phrases
<P1> For electronic media, the content, structure and context of the record change significantly from that of the paper world. The only approximate match with paper is the content element, where the letters and numbers look much the same on the computer screen as on paper. But the structure and especially the context of electronic records are not apparent when retrieved from the text only. (p.4)
Type
Journal
Title
Warrant and the Definition of Electronic Records: Questions Arising from the Pittsburgh Project
The University of Pittsburgh Electronic Recordkeeping Research Project established a model for developing functional requirements and metadata specifications based on warrant, defined as the laws, regulations, best practices, and customs that regulate recordkeeping. Research has shown that warrant can also increase the acceptance by records creators and others of functional requirements for recordkeeping. This article identifies areas related to warrant that require future study. The authors conclude by suggesting that requirements for recordkeeping may vary from country to country and industry to industry because of differing warrant.
Publisher
Kluwer Academic Publishers
Publication Location
The Netherlands
Critical Arguements
CA Poses a long series of questions and issues concerning warrant and its ability to increase the acceptance of recordkeeping requirements. Proposes that research be done to answer these questions. Discusses two different views about whether warrant can be universal and/or international.
Phrases
<P1> As we proceeded with the project [the University of Pittsburgh Electronic Recordkeeping Research Project] we ultimately turned our attention to the idea of the literary warrant -- defined as the mandate from law, professional best practices, and other social sources requiring the creation and continued maintenance of records. Wendy Duff's doctoral research found that warrant can increase the acceptance of some recordkeeping functional requirements, and therefore it has the potential to build bridges between archival professionals and others concerned with or responsible for recordkeeping. We did not anticipate the value of the literary warrant and, in the hindsight now available to us, the concept of the warrant may turn out to be the most important outcome of the project. <P2> In Wendy Duff's dissertation, legal, auditing and information science experts evaluated the authority of the sources of warrant for recordkeeping. This part of the study provided evidence that information technology standards may lack authority, but this finding requires further study. Moreover, the number of individuals who evaluated the sources of warrant was extremely small. A much larger number of standards should be included in a subsequent study and a greater number of subjects are needed to evaluate these standards. <P3> We found a strong relationship between warrant and the functional requirements for electronic recordkeeping systems. Research that studies this relationship and determines the different facets that may affect it might provide more insights into the relationship between the warrant and the functional requirements. <P4> [W]e need to develop a better understanding of the degree to which the warrant for recordkeeping operates in various industries, disciplines, and other venues. 
Some institutions operate in a much more regulated environment than others, suggesting that the importance of records and the understanding of records may vary considerably between institutional types, across disciplines and from country to country. <P5> We need to consider whether the recordkeeping functional requirements for evidence hold up or need to be revised for recordkeeping requirements for corporate memory, accountability, and cultural value -- the three broad realms now being used to discuss records and recordkeeping. <P6> The warrant gathered to date has primarily focused on technical, legal or the administrative value of records. A study that tested the effectiveness of warrant that supported the cultural or historical mandate of archives might help archivists gain support for their archival programs. <P7> This concern leads us to a need for more research about the understanding of records and recordkeeping in particular institutions, disciplines, and societies. <P8> A broader, and perhaps equally important question, is whether individual professionals and workers are even aware of their regulatory environment. <P9> How do the notion of the warrant and the recordkeeping functional requirements relate to the ways in which organizations work and the management tools they use, such as business process reengineering and data warehousing? <P10> What are the economic implications for organizations to comply with the functional requirements for recordkeeping in evidence? <P11> Is there a warrant and separate recordkeeping functional requirements for individual or personal recordkeeping? <P12> As more individuals, especially writers, financial leaders, and corporate and societal innovators, adopt electronic information technologies for the creation of their records, an understanding of the degree of warrant for such activity and our ability to use this warrant to manage these recordkeeping systems must be developed. 
<P13> We believe that archivists and records managers can improve their image if they become experts in all aspects of recordkeeping. This will require a thorough knowledge of the legal, auditing, information technology, and management warrant for recordkeeping. <P14> The medical profession emphasizes that [sic] need to practice evidence-based medicine. We need to find out what would happen if records managers followed suit, and emphasized and practiced warrant-based recordkeeping. Would this require a major change in what we do, or would it simply be a new way to describe what we have always done? <P15> More work also has to be done on the implications of warrant and the functional requirements for the development of viable archives and records management programs. <P16> The warrant concept, along with the recordkeeping functional requirements, seem to possess immense pedagogical implications for what future archivists or practicing archivists, seeking to update their skills, should or would be taught. <P17> We need to determine the effectiveness of using the warrant and recordkeeping functional requirements as a basis for graduate archival and records management education and for developing needed topics for research by masters and doctoral students. <P18> The next generation of educational programs might be those located in other professional schools, focusing on the particular requirements for records in such institutions as corporations, hospitals, and the courts. <P19> We also need to determine the effectiveness of using the warrant and recordkeeping functional requirements in continuing education, public outreach, and advocacy for helping policy makers, resource allocators, administrators, and others to understand the importance of archives and records. 
Can the warrant and recordkeeping functional requirements support or foster stronger partnerships with other professions, citizen action groups, and other bodies interested in accountability in public organizations and government? <P20> Focusing on the mandate to keep and manage records, instead of the records as artifacts or interesting stuff, seems much more relevant in late twentieth century society. <P21> We need to investigate the degree to which records managers and archivists can develop a universal method for recordkeeping. ... Our laws, regulations, and best practices are usually different from country to country. Therefore, must any initiative to develop warrant also be bounded by our borders? <P22> A fundamental difference between the Pittsburgh Project and the UBC project is that UBC wishes to develop a method for managing and preserving electronic records that is applicable across all juridical systems and cultures, while the Pittsburgh Project is proposing a model that enables recordkeeping to be both universal and local at the same time. <P23> We now have a records management standard from Australia which is relevant for most North American records programs. It has been proposed as an international standard, although it is facing opposition from some European countries. Can there be an international standard for recordkeeping and can we develop one set of procedures which will be accepted across nations? Or must methods of recordkeeping be adapted to suit specific cultures, juridical systems, or industries?
Conclusions
RQ See above.
Type
Journal
Title
Six degrees of separation: Australian metadata initiatives and their relationships with international standards
CA The record used to be annotated by hand, but with the advent of electronic business the record has now become unreliable and increasingly vulnerable to loss or corruption. Metadata is part of a recordkeeping regime instituted by the NAA to address this problem.
Phrases
<P1> Electronic metadata makes the digital world go round. The digital world also works better when there are standards. Standards encourage best practice. They help the end user by encouraging the adoption of common platforms and interfaces in different systems environments. (p. 275) <P2> In relation to Web-based publishing and online service delivery, the Strategy, which has Cabinet-level endorsement, requires all government agencies to comply with metadata and recordkeeping standards issued by the NAA. (p.276) <warrant>
Conclusions
RQ How do you effectively work with software vendors and government in order to encourage metadata schema adoption and use?
Type
Journal
Title
Digital preservation: Where we are, where we're going, where we need to be
CA Digital preservation will begin to come into its own. The past five years were about building access; now standards are coalescing and more focus is being paid to actual preservation strategies. Major legal obstacles include the DMCA, which restricts what institutions can do to preserve digital information. There are economic challenges, and we do not really know how much digital preservation will cost.
Phrases
<P1> "There will be change, there is no guarantee that you can pick a technology and stay with it for ten years. We have to have an awareness of technological change and what's coming -- we listen to peers and the larger institutions that are taking leading and bleeding edge roles, and we make wise decisions. So in this case it is OK to be trailing edge and choose something that is well-established." (p.3)
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
Over the last decade a number of writers have encouraged archivists to develop strategies and tactics to redefine their role and to insert themselves into the process of designing recordkeeping systems. This paper urges archivists to exploit the authority inherent in the laws, regulations, standards, and professional best practices that dictate recordkeeping specifications to gain greater acceptance for the requirements for electronic evidence. Furthermore, it postulates that this proactive approach could assist in gaining greater respect for the archival profession.
Critical Arguements
CA The use of authoritative sources of warrant would improve acceptance of electronic records as evidence and create greater respect for the archival profession.
Phrases
<P1> The legal, administrative, fiscal, or information value of records is dependent upon the degree of trust society places in records as reliable testimony or evidence of the acts they purport to document. In turn, this trust is dependent on society's faith in the procedures that control the creation and maintenance of the record. <P2> [S]ociety bestows some methods of recordkeeping and record creating with an authority or 'warrant' for generating reliable records. <P3> David Bearman first proposed the idea of "literary warrant." <P4> [S]tatements of warrant provide clear instructions on how records should be kept and delineate elements needed for the records to be complete. <P5> The information technology field promulgates standards, but in North America adherence to them is voluntary rather than obligatory. <P6> The University of Pittsburgh Electronic Recordkeeping Project suggested that requirements for electronic recordkeeping should derive from authoritative sources, such as the law, customs, standards, and professional best practices accepted by society and codified in the literature of different professions concerned with records and recordkeeping rather than developed in isolation. <P7> On their own, archival requirements for recordkeeping have very little authority as no authoritative agencies such as standards boards or professional associations have yet to endorse them [sic] and few archivists have the authority to insist that their organizations follow them. <P8> An NHPRC study suggested that archivists have not been involved in the process of meeting the challenges of electronic records because they are undervalued by their colleagues, or, in other words, are not viewed as a credible source.
Conclusions
RQ "By highlighting the similarity between recordkeeping requirements and the requirements delineated in authoritative statements in the law, auditing standards, and professional best practices, archivists will increase the power of their message. ... If archivists are to take their rightful place as regulators of an organization's documentary requirements, they will have to reach beyond their own professional literature and understand the requirements for recordkeeping imposed by other professions and society in general. Furthermore, they will have to study methods of increasing the accpetance of their message and the impact and power of warrant."
Type
Journal
Title
Will Metadata Replace Archival Description: A Commentary
CA Before archival description can be replaced by metadata, "archivists must first study their user needs, identify processes that protect the integrity and impartiality of records, and ensure the capture of important contextual information." (p.38)
Phrases
<P1> Unfortunately, information systems often do not create records, concentrating instead on the preservation of information to the detriment of recordkeeping. Concern over this issue has led Wallace to promote a new role for archivists, one that places them at the conception of the life cycle, establishing standards for record preservation and management as well as dictating record creation. Demarcation between archivists and records managers disappears in this new paradigm, and a new role as auditor, system designer, and regulator begins to emerge. (p.34) <P2> "Metadata are essential if archivists are to maintain the integrity and authenticity of evidence of actions. McNeil likens metadata systems to protocol registers and sees metadata itself as evidence, as well as a means of preserving evidence." (p.35)
Conclusions
RQ Will metadata replace archival description? Will metadata requirements fulfill the needs of secondary users? Will metadata require secondary descriptions?
Type
Journal
Title
Ensuring the Preservation of Reliable Evidence: A Research Project Funded by the NHPRC
CA Archivists need to propagate research projects that delineate means to engender trust and accountability for our e-records.
Phrases
<P1> "The task of preserving evidence in a hardware and software dependent environment challenges archivists to develop new techniques and new ways of thinking about what to capture and how to preserve it. The development of the functional requirements, including the production rules, the literary warrant, and the metadata reference model, is a first step toward solving some of the most pressing problems that archivists face in the new electronic world." (p.39) <P2> As records migrate from a stable paper reality to an intangible electronic existence, their physical attributes, vital for establishing the authenticity and reliability of the evidence they contain, are threatened. (p. 29) <P3> Unfortunately, systems that create and maintain electronic records often fail to preserve the structure or the context essential for the evidentiary nature of records. (p.30)
Conclusions
RQ Can warrant increase the credibility of the functional requirements for recordkeeping? Can one type of warrant be more influential than others? Is the warrant from a person's specific profession seen by him or her as more important than others?
Type
Journal
Title
Definitions of electronic records: The European method
CA The consistent use of well-defined, agreed-upon terminology is a powerful tool for archivists. The point of view of diplomatics may be useful.
Phrases
<P1> It is very difficult for a European archivist or records manager to understand why it is necessary to use new terms to express old things. Literary warrant is one of these terms. If literary warrant simply means "best practice and professional culture" in recordkeeping, we only need to know what creators did for centuries and still do today (and probably will also do in the future) in this area. (p. 220) <P2> Personally I am absolutely sure that without an effort at clarifying definitions in the recordkeeping environment, there is no way to obtain significant results in the field of electronic records. As the first, theoretical step, clarifying definitions will verify principles, from the juridical and technological point of view, which will be the basis of systems and particular applications. (p.220)
Conclusions
RQ What is the intrinsic nature of a record? How does the international community understand the concept of archival bond? Do they see it as something positive or a hindrance?
Type
Journal
Title
Law, evidence and electronic records: A strategic perspective from the global periphery
CA A recordkeeping paradigm set up around the records continuum will take us into the future, because it sees opportunities, not problems, in e-environments. It fosters accountability through evidence-generating recordkeeping practices.
Phrases
<P1> This challenge is being addressed by what Chris Hurley has called second-generation archival law, which stretches the reach of archival jurisdictions into the domain of the record-creator. A good example of such archival law is South Africa's National Archives Act of 1996, which gives the National Archives regulatory authority over all public records from the moment of their creation. The Act provides a separate definition of "electronic records systems" and accords the National Archives specific powers in relation to their management. Also significant is that the Act brings within the National Archives' jurisdiction those categories of record-creators commonly allowed exclusion -- the security establishment, public services outside formal structures of government, and "privatized" public service agencies. (p.34) <P2> A characteristic (if an absence can be a characteristic) of most archival laws, first and second generation, is a failure to define either the conditions/processes requiring "recording" or the generic attributes of a "record." (p.34) <P3> Archival law, narrowly defined, is not at the cutting edge and is an increasingly small component of broader recordkeeping regimes. This is one of the many signs of an emerging post-custodial era, which Chris Hurley speculates will be informed by a third generation of archival law. Here, the boundaries between recordkeeping domains dissolve, with all of them being controlled by universal rules. (p.34)
Conclusions
RQ What is the relationship between the event and the record? Is the idea of evidence pivotal to the concept of "recordness"? Should evidence be privileged above all else in defining a record? What about remembering, forgetting, imagining?
Type
Journal
Title
Building record-keeping systems: Archivists are not alone on the wild frontier
CA The digital environment offers archivists a host of new tools that can be adapted and used for recordkeeping. However, archivists must choose their tools judiciously while considering the long-term implications of their use as well as research and development. Ultimately, they must pick tools and strategies that dovetail with their institutions' specific needs while working to produce reliable and authentic records.
Phrases
<P1> Evidence from this review of emerging methods for secure and authentic electronic communications shows that the division of responsibility, accountability, and jurisdiction over recordkeeping is becoming more complex than a clear line between the records creator and the records preserver. (p.66) <P2> Storage of records in encrypted form is another area of concern because encryption adds additional levels of systems dependency on access to keys, proprietary encryption algorithms, hardware, and software. (p.62) <P3> It is important for archivists and records managers to understand parallel developments, because some new strategies and methods may support recordkeeping, while others may impede the achievement of archival objectives. (p.45) <P4> The concept of warrant and subsequent research on it by Wendy Duff is a significant contribution, because it situates the mandates for creating and maintaining records in a legal, administrative, and professional context, and it presents a methodology for locating, compiling, and presenting the rules governing proper and adequate documentation in modern organizations. (p. 48)
Conclusions
RQ Are electronic recordkeeping systems truly inherently inferior to paper-based systems in their capacity to maintain authentic records over time? How tightly can recordkeeping be integrated into normal business processes, and where does one draw the line between how a business does its work and how it does its recordkeeping?
Type
Journal
Title
How Do Archivists Make Electronic Archives Usable and Accessible?
CA In order to make electronic archives usable, archivists will need to enhance and link access systems to facilitate resource discovery while making the whole process as seamless and low-cost (or no-cost) as possible for the user.
Phrases
<P1> Rather than assuming that the archival community will succeed in transferring all valuable electronic records to archival institutions for preservation and future access, archivists must develop strategies and methods for accessibility and usability that can span a variety of custodial arrangements. (p.9) <P2> Maintaining linkages between different formats of materials will become increasingly burdensome if archivists do not find ways to develop integrated access systems. (p.10) <P3> Archivists must also think about ways to teach users the principles of a new digital diplomatics so that they can apply these principles themselves to make educated judgements about the accuracy, reliability, and authenticity of the documents they retrieve from electronic archives. (p.15)
Type
Journal
Title
The Making and the Keeping of Records: (1) What are Finding Aids For?
CA Context is everything. Retrospective assignment of description when records are accessioned is largely irrelevant to evidential requirements. Only by understanding the circumstances that led to a record's creation, and the changes it has undergone since, can we understand the record.
Phrases
<P1> We want to establish archival systems as the source for metadata needed for the recordkeeping task -- providing recordkeepers with the kind of contextualizing knowledge archivists are used to managing. This cannot happen if description remains enmeshed in collection description -- circumscribed by location of records and by their appraised value. (p.59)
Type
Journal
Title
Documenting digital images: Textual meta-data at the Blake Archive
The Electronic Library: The International Journal for Minicomputer, Microcomputer, and Software Applications in Libraries
Publication Year
1998
Volume
16
Issue
4
Pages
239
Critical Arguements
CA One of the critical issues in the future development of digital libraries is the provision of documentary metadata for non-textual electronic files.
Phrases
<P1> "When libraries create digital image collections, however, documentation becomes more problematic. Even if an image is surrounded by a robust network of supporting materials, the functionality of client-server networks such as the World Wide Web permits the image to become detached from the documentary process." (p. 239)
Type
Journal
Title
When Documents Deceive: Trust and Provenance as New Factors for Information Retrieval in a Tangled Web
Journal of the American Society for Information Science and Technology
Periodical Abbreviation
JASIST
Publication Year
2001
Volume
52
Issue
1
Pages
12
Publisher
John Wiley & Sons
Critical Arguements
CA "This brief and somewhat informal article outlines a personal view of the changing framework for information retrieval suggested by the Web environment, and then goes on to speculate about how some of these changes may manifest in upcoming generations of information retrieval systems. It also sketches some ideas about the broader context of trust management infrastructure that will be needed to support these developments, and it points towards a number of new research agendas that will be critical during this decade. The pursuit of these agendas is going to call for new collaborations between information scientists and a wide range of other disciplines." (p. 12) Discusses public key infrastructure (PKI) and Pretty Good Privacy (PGP) systems as steps toward ensuring the trustworthiness of metadata online, but explains their limitations. Makes a distinction between the identity of providers of metadata and their behavior, arguing that it is the latter we need to be concerned with.
Phrases
<P1> Surrogates are assumed to be accurate because they are produced by trusted parties, who are the only parties allowed to contribute records to these databases. Documents (full documents or surrogate records) are viewed as passive; they do not actively deceive the IR system.... Compare this to the realities of the Web environment. Anyone can create any metadata they want about any object on the net, with any motivation. (p. 13) <P2> Sites interested in manipulating the results of the indexing process rapidly began to exploit the difference between the document as viewed by the user and the document as analyzed by the indexing crawler through a set of techniques broadly called "index spamming." <P3> Pagejacking might be defined generally as providing arbitrary documents with independent arbitrary index entries. Clearly, building information retrieval systems to cope with this environment is a huge problem. (p. 14) <P4> [T]he tools are coming into place that let one determine the source of a metadata assertion (or, more precisely and more generally) the identity of the person or organization that stands behind the assertion, and to establish a level of trust in this identity. (p. 16) <P5> It is essential to recognize that in the information retrieval context one is not concerned so much with identity as with behavior. ... This distinction is often overlooked or misunderstood in discussions about what problems PKI is likely to solve: identity alone does not necessarily solve the problem of whether to trust information provided by, or warranted by, that identity. ... And all of the technology for propagating trust, either in hierarchical (PKI) or web-of-trust identity management, is purely about trust in identity. (p. 16) <P6> The question of formalizing and recording expectations about behavior, or trust in behavior, are extraordinarily complex, and as far as I know, very poorly explored. (p. 16) <P7> [A]n appeal to certification or rating services simply shifts the problem: how are these services going to track, evaluate, and rate behavior, or certify skills and behavior? (p. 16) <P8> An individual should be able to decide how he or she is willing to have identity established, and when to believe information created by or associated with such an identity. Further, each individual should be able to have this personal database evolve over time based on experience and changing beliefs. (p. 16) <P9> [T]he ability to scale and to respond to a dynamic environment in which new information sources are constantly emerging is also vital. <P10> In determining what data a user (or an indexing system, which may make global policy decisions) is going to consider in matching a set of search criteria, a way of defining the acceptable level of trust in the identity of the source of the data will be needed. (p. 16) <P11> Only if the data is supported by both sufficient trust in the identity of the source and the behavior of that identity will it be considered eligible for comparison to the search criteria. Alternatively, just as ranking of result sets provided a more flexible model of retrieval than just deciding whether documents or surrogates did or did not match a group of search criteria, one can imagine developing systems that integrate confidence in the data source (both identity and behavior, or perhaps only behavior, with trust in identity having some absolute minimum value) into ranking algorithms. (p. 17) <P12> As we integrate trust and provenance into the next generations of information retrieval systems we must recognize that system designers face a heavy burden of responsibility. ... New design goals will need to include making users aware of defaults; encouraging personalization; and helping users to understand the behavior of retrieval systems <warrant> (p. 18) <P13> Powerful paternalistic systems that simply set up trust-related parameters as part of the indexing process and thus automatically apply a fixed set of such parameters to each search submitted to the retrieval system will be a real danger. (p. 17)
Conclusions
RQ "These developments suggest a research agenda that addresses indexing countermeasures and counter-countermeasures; ways of anonymously or pseudonymously spot-checking the results of Web-crawling software, and of identifying, filtering out, and punishing attempts to manipulate the indexing process such as query-source-sensitive responses or deceptively structured pages that exploit the gap between presentation and content." (p. 14) "Obviously, there are numerous open research problems in designing such systems: how can the user express these confidence or trust constraints; how should the system integrate them into ranking techniques; how can efficient index structures and query evaluation algorithms be designed that integrate these factors. ... The integration of trust and provenance into information retrieval systems is clearly going to be necessary and, I believe, inevitable. If done properly, this will inform and empower users; if done incorrectly, it threatens to be a tremendously powerful engine of censorship and control over information access." (p. 17)
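The ranking approach quoted above -- requiring a minimum level of trust in the identity of a data source, then blending trust in that source's behavior into the relevance ranking -- can be sketched in a few lines. This is only an illustration of the idea; the function, field names, weights, and threshold below are assumptions for the sketch, not anything specified in the article.

```python
# Sketch of trust-weighted ranking: a result is eligible only if trust
# in its source's identity meets a floor, and the final score blends
# topical relevance with trust in the source's behavior.
# All names, weights, and thresholds here are illustrative assumptions.

def rank(results, min_identity_trust=0.5, behavior_weight=0.4):
    """results: list of dicts with keys 'doc', 'relevance' (0-1),
    'identity_trust' (0-1), and 'behavior_trust' (0-1)."""
    # Identity trust acts as an absolute minimum, as Lynch's <P11> suggests.
    eligible = [r for r in results
                if r["identity_trust"] >= min_identity_trust]
    # Behavioral trust is folded into the ranking score itself.
    for r in eligible:
        r["score"] = ((1 - behavior_weight) * r["relevance"]
                      + behavior_weight * r["behavior_trust"])
    return sorted(eligible, key=lambda r: r["score"], reverse=True)
```

A highly relevant document from a source of untrusted identity is excluded outright, while among eligible documents a well-behaved source can outrank a more relevant but less trustworthy one.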
Type
Journal
Title
Metadata Strategies and Archival Description: Comparing Apples to Oranges
Advocates of a "metadata systems approach" to the description of electronic records argue that metadata's capacity to provide descriptive information about the context of electronic records creation will obviate, or reduce significantly, the need for traditional archival description. This article examines the assumptions about the nature of archival description and of metadata on which metadata strategies are grounded, for the purposes of ascertaining the following: whether the skepticism concerning the capacity of traditional description to meet the challenges posed by the so-called "second generation" of electronic records is justified; whether the use of metadata as archival description is consistent with their nature and purpose; and whether metadata are capable of serving archival descriptive purposes.
Critical Arguements
CA "Before the archival profession assigns to traditional archival description the diminished role of "added value" (i.e. accessory) or abandons it altogether, the assumptions about the nature of archival description and of metadata on which metadata strategies are grounded ought to be carefully examined. Such an examination is necessary to ascertain the following: whether the skepticism concerning the capacity of traditional description to meet the challenges posed by the so-called "second generation" of electronic records is justified, whether the use of metadata as archival description is consistent with their nature and purpose, and whether metadata are capable of serving archival purposes."
Phrases
<P1> In an article published in Archivaria, David Wallace summarized recent writing on the subject of metadata and concluded that "[d]ata dictionaries and the types of metadata that they house and can be built to house should be seriously evaluated by archivists" because of their potential to significantly improve and ultimately transform traditional archival practice in the areas of appraisal, arrangement, description, reference, and access. <warrant> <P2> In the area of description, specifically, advocates of "metadata management" or a "metadata systems approach" believe that metadata's capacity to provide descriptive information about the context of electronic records creation will obviate, or reduce significantly, the need for traditional description. <P3> Charles Dollar maintains that archival participation in the IRDS standard is essential to ensure that archival requirements, including descriptive requirements, are understood and adopted within it. <warrant> <P4> According to David Wallace, "archivists will need to concentrate their efforts on metadata systems creation rather than informational content descriptions, since in the electronic realm, archivists' concern for informational value will be eclipsed by concern for the evidential value of the system." <warrant> <P5> Charles Dollar, for his part, predicts that, rather than emphasize "the products of an information system," a metadata systems approach to description will focus on "an understanding of the information system context that supports organization-wide information sharing." <P6> Because their scope and context are comparatively narrow, metadata circumscribe and atomize these various contexts of records creation. Archival description, on the other hand, enlarges and integrates them. In so doing it reveals continuities and discontinuities in the matrix of function, structure, and record-keeping over time.
<P7> Metadata are part of this broader context, since they constitute a series within the creator's fonds. The partial context provided by metadata should not, however, be mistaken for the whole context. <P8> Metadata, for example, may be capable of explaining contextual attributes of the data within an electronic records system, but they are incapable of describing themselves -- i.e., their own context of creation and use -- because they cannot be detached from themselves. For this reason, it is necessary to describe the context in which the metadata are created so that their meaning also will be preserved over time. <P9> A metadata system is like a diary that, in telegraphic style, records the daily events that take place in the life of an individual as they occur and from the individual's perspective. <P10> Archival description, it could be said, is the view from the plane; metadata, the view from the field as it is plowed. <P11> While a close-up shot-- such as the capture of a database view -- may be necessary for the purposes of preserving record context and system functionality, it does not follow that such a snapshot is necessary or even desirable for the purposes of description. <P12> Because the context revealed by metadata systems is so detailed, and the volume of transactions they capture is so enormous, metadata may in fact obscure, rather than illuminate, the broader administrative context and thereby bias the users' understanding of the records' meaning. In fact, parts of actions and transactions may develop entirely outside of the electronic system and never be included in the metadata. <P13> If the metadata are kept in their entirety, users searching for documents will have to wade through a great deal of irrelevant data to find what they need. If the metadata are chopped up into bits corresponding to what has been kept, how comprehensible will they be to the user?
<P14> The tendency to describe metadata in metaphorical terms, e.g., in relation to archival inventories, has distracted attention from consideration of what metadata are in substantial, concrete terms. They are, in fact, records created and used in the conduct of affairs of which they form a part. <P15> The transactions captured by metadata systems may be at a more microscopic level than those captured in registers and the context may be more detailed, given the technological complexity of electronic record-keeping environments. Nevertheless, their function remains the same. <P16> And, like protocol registers, whose permanent retention is legislated, metadata need to be preserved in perpetuity because they are concrete evidence of what documents were made and received, who handled them, with what results, and the transactions to which they relate. <warrant> <P17> While it is true that metadata systems show or reveal the context in which transactions occur in an electronic system and therefore constitute a kind of description of it -- Jenkinson made the same observation about registers -- their real object is to record the fact of these transactions; they should be, like registers, "preserved as a [record] of the proceedings in that connection." <P18> Viewing metadata systems as tools for achieving archival purposes, rather than as tools for achieving the creators' purposes is dangerous because it encourages us to, in effect, privilege potential secondary uses of metadata over their actual primary use; in so doing, we could reshape such use for purposes other than the conduct of affairs of which they are a part. <P19> Metadata strategies risk compromising, specifically, the impartiality of the records' creation. 
<P20> For archivists to introduce in the formation of metadata records requirements directed toward the future needs of archivists and researchers rather than toward the current needs of the creator would contribute an element of self-consciousness into the records creation process that is inconsistent with the preservation of the records' impartiality. <P21> If the impartiality of the metadata is compromised, their value as evidence will be compromised, which means, ultimately, that the underlying objective of metadata strategies -- the preservation of evidence -- will be defeated. <P22> None of these objections should be taken to suggest that archivists do not have a role to play in the design and maintenance of metadata systems. It is, rather, to suggest that that role must be driven by our primary obligation to protect and preserve, to the extent possible, the essential characteristics of the archives. <P23> The proper role of an archivist in the design of a metadata system, then, is to assist the organization in identifying its own descriptive needs as well as to ensure that the identification process is driven, not by narrowly defined system requirements, but by the organization's overarching need and obligation to create and maintain complete, reliable, and authentic records. <P24> That is why it is essential that information holdings are identified and described in a meaningful way, organized in a logical manner that facilitates their access, and preserved in a manner that permits their continuing use. <P25> Record-keeping requirements for electronic records must address the need to render documentary relationships visible and to build in procedures for authentication and preservation; such measures will ensure that record-keeping systems meet the criteria of "integrity, currency and relevancy" necessary to the records creator.
<P26> In other words, effective description is a consequence of effective records management and intelligent appraisal, not their purpose. If the primary objectives of metadata are met, description will be facilitated and the need for description at lower levels (e.g., below the series level) may even be obviated. <P27> Metadata systems cannot and should not replace archival description. To meet the challenges posed by electronic records, it is more important than ever that we follow the dictates of archival science, which begin from a consideration of the nature of archives. <P28> Archival participation in the design and maintenance of metadata systems must be driven by the need to preserve them as archival documents, that is, as evidence of actions and transactions, not as descriptive tools. Our role is not to promote our own interests, but to deepen the creator's understanding of its interests in preserving the evidence of its own actions and transactions. We can contribute to that understanding because we have a broader view of the creator's needs over time. In supporting these interests, we indirectly promote our own. <P29> To ensure that our descriptive infrastructure is sound -- that is to say, comprehensible, flexible, efficient, and effective -- we need equally to analyze our own information management methods and, out of that analysis, to develop complementary systems of administrative and intellectual control that will build upon each other. By these means we will be able to accommodate the diversity and complexity of the record-keeping environments with which we must deal.
Conclusions
RQ "Since 'current metadata systems do not account for the provenancial and contextual information needed to manage archival records,' archivists are exhorted [by Margaret Hedstrom] to direct their research efforts (and research dollars) toward the identification of the types of metadata that ought to be captured and created to meet archival descriptive requirements. "
SOW
DC Dr. Heather MacNeil is an Assistant Professor at the School of Library, Archival, and Information Studies at the University of British Columbia. Dr. MacNeil's major areas of interest include: trends and themes in archival research & scholarship; arrangement and description of archival documents; management of current records; trustworthiness of records as evidence; protection of personal privacy; interdisciplinary perspectives on record trustworthiness; and archival preservation of authentic electronic records.
Type
Journal
Title
Challenges for service providers when importing metadata in digital libraries
CA Problems in implementing metadata for online resource discovery, in this case for digital libraries, will not be solved simply by adopting a common schema. Intellectual property rights remain another major obstacle to be dealt with.
Phrases
RQ Under what circumstances can metadata be altered? How should the copyright information of a resource be distinguished from the copyright information of its metadata? Will an audit trail be used as metadata shared with other repositories?
This article provides an overview of evolving Australian records continuum theory and the records continuum model, which is interpreted as both a metaphor and a new worldview, representing a paradigm shift in Kuhn's sense. It is based on a distillation of research findings drawn from discourse, literary warrant and historical analysis, as well as case studies, participant observation and reflection. The article traces the emergence in Australia in the 1990s of a community of practice which has taken continuum rather than life cycle based perspectives, and adopted postcustodial approaches to recordkeeping and archiving. It "places" the evolution of records continuum theory and practice in Australia in the context of a larger international discourse that was reconceptualizing traditional theory, and "reinventing" records and archives practice.
Publisher
Kluwer Academic Publishers
Publication Location
The Netherlands
Critical Arguements
CA Looks at the development of the Australian community of practice that led to records continuum theory: an approach that, in contrast to the North American life cycle approach, sees recordkeeping and archival practices as part of the same continuum of activities. Since the 1990s, there has been a lively debate between proponents of these two different ways of thinking. The second part of the article is highly theoretical, situating records continuum theory in the larger intellectual trend toward postmodernism and postpositivism.
Phrases
<P1> The model was built on a unifying concept of records inclusive of archives, which are defined as records of continuing value. It also drew on ideas about the "fixed" and "mutable" nature of records, the notion that records are "always in a process of becoming." (p. 334) <P2> Continuum ideas about the nature of records and archives challenge traditional understandings which differentiate "archives" from "records" on the basis of selection for permanent preservation in archival custody, and which focus on their fixed nature. Adopting a pluralist view of recorded information, continuum thinking characterises records as a special genre of documents in terms of their intent and functionality. It emphasises their evidentiary, transactional and contextual nature, rejecting approaches to the definition of records which focus on their subject content and informational value. (p. 335) <P3> [R]ecordkeeping and archiving processes ... help to assure the accessibility of meaningful records for as long as they are of value to people, organisations, and societies -- whether that be for a nanosecond or millennia. (p. 336) <P4> [I]f North American understandings of the term record keeping, based on life cycle concepts of records management, are used to interpret the writings of members of the Australian recordkeeping community, there is considerable potential for misunderstanding. <P5> Members of the recordkeeping and archiving community have worked together, often in partnership with members of other records and archives communities, on a range of national policy and standards initiatives, particularly in response to the challenge of electronic recordkeeping. These collaborative efforts resulted in AS 4390, the Australian Standard: Records Management (1996), the Australian Council of Archives' Common Framework for Electronic Recordkeeping (1996), and the Australian Records and Archives Competency Standards (1997). 
In a parallel and interconnected development, individual archival organisations have been developing electronic recordkeeping policies, standards, system design methodologies, and implementation strategies for their jurisdictions, including the National Archives of Australia's suite of standards, policies, and guidelines under the e-permanence initiative launched in early 2000. These developments have been deliberately set within the broader context of national standards and policy development frameworks. Two of the lead institutions in these initiatives are the National Archives of Australia and the State Records Authority of New South Wales, which have based their work in this area on exploration of fundamental questions about the nature of records and archives, and the role of recordkeeping and archiving in society. <warrant> (p. 339) <P6> In adopting a continuum-based worldview and defining its "place" in the world, the Australian recordkeeping and archiving community consciously rejected the life cycle worldview that had dominated records management and archives practice in the latter half of the 20th century in North America. ... They were also strong advocates of the nexus between accountable recordkeeping and accountability in a democratic society, and supporters of the dual role of an archival authority as both a regulator of current recordkeeping, and preserver of the collective memory of the state/nation. (p. 343-344) <P7> [P]ost-modern ideas about records view them as dynamic objects that are fixed in terms of content and meaningful elements of their structure, but linked to ever-broadening layers of contextual metadata that manages their meanings, and enables their accessibility and useability as they move through "spacetime." (p. 349)
<P8> In exploring the role of recordkeeping and archiving professionals within a postcustodial frame of reference, archival theorists such as Brothman, Brown, Cook, Harris, Hedstrom, Hurley, Nesmith, and Upward have concluded that they are an integral part of the record and archive making and keeping process, involved in society's remembering and forgetting. (p. 355) <P9> Writings on the societal context of functional appraisal have gone some way to translate into appraisal policies and strategies the implications of the shifts in perception away from seeing records managers as passive keepers of documentary detritus ... and archivists as Jenkinson's neutral, impartial custodians of inherited records. (p. 355-356)
Conclusions
RQ "By attempting to define, to categorise, pin down, and represent records and their contexts of creation, management, and use, descriptive standards and metadata schema can only ever represent a partial view of the dynamic, complex, and multi-dimensional nature of records, and their rich webs of contextual and documentary relationships. Within these limitations, what recordkeeping metadata research is reaching towards are ways to represent records and their contexts as richly and extensively as possible, to develop frameworks that recognise their mutable and contingent nature, as well as the role of recordkeeping and archiving professionals (records managers and archivists) in their creation and evolution, and to attempt to address issues relating to time and space." (p. 354)
Type
Journal
Title
Accessing essential evidence on the web: Towards an Australian recordkeeping metadata standard
CA Standardized recordkeeping metadata allows for access to essential evidence of business activities and promotes reliability and authenticity. The Australian records and metadata community have been working hard to define standards and identify requirements as well as support interoperability.
Phrases
<P1> But records, as accountability traces and evidence of business activity, have additional metadata requirements. Authoritative, well-structured metadata which specifies their content, structure, context, and essential management needs must be embedded in, wrapped around and otherwise persistently linked to them from the moment they are created if they are to continue to function as evidence. (p.2) <P2> People do business in social and organizational contexts that are governed by external mandates (e.g. social mores, laws) and internal mandates (e.g. policies, business rules). Mandates establish who is responsible for what, and govern social and organizational activity, including the creation of full and accurate records. <warrant> (p.3)
Type
Journal
Title
Describing Records in Context in the Continuum: The Australian Recordkeeping Metadata Schema
CA RKMS is based on traditional recordkeeping thinking. However, it also looks to the future by viewing records as active agents of change, as intelligent information objects, which are supported by the metadata that RKMS' framework provides. Through RKMS, the dynamic world of business can be linked to the more passive world of cyberspace resource management.
Phrases
<P1> As long as records remain in the local domains in which they are created, a lot of broader contextual metadata is "in the air," carried in the minds of the corporate users of the records. When records move beyond the boundaries of the local domain in which they are created or, as is increasingly the case in networked environments, they are created in the first place in a global rather than a local domain, then this kind of metadata needs to be made explicit -- that is, captured and persistently linked to the record. This is essential so that users in the broader domain can uniquely identify, retrieve and understand the meanings of records. (p.7) <P2> The broader social context of the project is the need for individuals, society, government, and commerce to continually access the information they need to conduct their business, protect their rights and entitlements, and securely trace the trail of responsibility and action in distributed enterprises. ... Maintaining reliable, authentic and useable evidence of transactions through time and space has significant business, social, and cultural implications, as records provide essential evidence for purposes of governance, accountability, memory and identity. (p.6)
Conclusions
RQ There is a need to develop typologies of recordkeeping relationships such as agent to record and better ways to express them through metadata.
Type
Journal
Title
Towards Frameworks for Standardising Recordkeeping Metadata
CA There are many challenges to devising metadata schema to manage records over time. Continuum thinking provides a conceptual framework to identify these problems.
Phrases
<P1> It is clear from the SPIRT Project definition that recordkeeping and archival control systems have always been about capturing and managing recordkeeping metadata. (p.30) <P2> One of the keys to understanding the Project's approach to what metadata needs to be captured, persistently linked to documentation of social and business activity, and managed through space and time, lies in the continuum view of records. In continuum thinking, [records] are seen not as 'passive objects to described retrospectively,' but as agents of action, 'active participants in business processes and technologies.'" (p.37)
CA Information and communications technologies (ICTs) can positively or negatively affect the availability of records for accountability. ICTs challenge basic organizational values in records management, such as finding a balance between central control and autonomy. A well-designed electronic records management plan needs to take into account the various values of an organization's "accountability situation."
Phrases
<P1> "A systematic records management policy is required to ensure that the appropriate records will be available for accountability processes. This involves developing and maintaining policies, procedures, and methodologies, appointing experts, and creating and managing specialized departments." (p. 260) "The fact that many records are not converted and are handled exclusively in an electronic form may have an important impact on the content and quality of records kept for purposes of accountability." (p. 260)
Conclusions
RQ We must look at the tools of information and communications technology closely. Often researchers assume that tools only need to be shaped or adapted in a certain way in order to address a variety of problems in electronic records management, while ignoring the fact that the shaping of a tool is in itself highly controversial.
Type
Journal
Title
Research Issues in Australian Approaches to Policy Development
Drawing on his experience at the Australian Archives in policy development on electronic records and recordkeeping for the Australian Federal Government sector, the author argues for greater emphasis on the implementation side of electronic records management. The author questions whether more research is a priority over implementation, and also argues that if archival institutions wish to be taken seriously by their clients they need to pay greater attention to getting their own organisations in order. He suggests the way to do this is by improving internal recordkeeping practices and systems and developing a resource and skills base suitable for the delivery of electronic recordkeeping policies and services to clients.
Publisher
Kluwer Academic Publishers
Publication Location
Netherlands
Critical Arguements
CA "None of the issues which have been raised regarding the management of electronic records are insurmountable or even difficult from a technological viewpoint. The technology is there to develop electronic recordkeeping systems. The technology is there to capture and maintain electronic records. The technology is there to enable access over time. The technology is there to enable recordkeeping at a level of sophistication and accuracy hitherto undreamt of. To achieve our goal though requires more than technology, remember that is part of the problem. To achieve our goal requires human understanding, planning, input and motivation and that requires us to convince others that it is worth doing. This view has a significant impact on the development of research agendas and implementation projects." (p. 252) "Looking at electronic records from a strategic recordkeeping perspective requires us to see beyond the specific technology issues toward the wider corporate issues, within our organizational, professional and environmental sectors. In summary they are: Building alliances: nationally and internationally; Re-inventing the archival function: cultural change in the archives and records community and institutions; Getting our own house in order: establishing archival institutions as models of best practice for recordkeeping; Devoting resources to strategic developments; and Re-training and re-skilling archivists and records managers." (p. 252-253)
Phrases
<P1> The issue for me therefore is the development of a strategic approach to recordkeeping, whether it be in Society generally, whole of Government, or in your own corporate environment. The wider focus should be on the development of recordkeeping systems, and specifically electronic recordkeeping systems. Without such a strategic approach I believe our efforts on electronic records will largely be doomed to failure. (p. 252) <P2> We have to influence recordkeeping practices in order to influence the creation and management of electronic records. (p. 253) <P3> Given that there is no universal agreement within the archives and records community on dealing with electronic records, how can we expect to successfully influence other sectoral interests and stake-holders, not to mention policy makers and resource providers? Institutions and Professional bodies have to work together and reach agreement and develop strategic positions. (p. 253) <P4> The emerging role of recordkeeping professionals is to define recordkeeping regimes for organizations and their employees, acting as consultants and establishing and monitoring standards, rather than deciding about specific records in specific recordkeeping systems or creating extensive documentation about them. (p. 254) <P5> Archival institutions need to practice what they preach and develop as models for best practice in recordkeeping. (p. 254-255) <P6> Resources devoted to electronic records and recordkeeping policy and implementation within archival institutions have not been commensurate with the task. (p. 255) <P7> Contact with agencies needs to be more focused at middle and senior management to ensure that the importance of accountability and recordkeeping is appreciated and that strategies and systems are put in place to ensure that records are created, kept and remain accessible. (p. 255)
<P8> In order to do this for electronic records archival institutions need to work with agencies to: assist in the development of recordkeeping systems through the provision of appropriate advice; identify electronic records in their custody which are of enduring value; identify and dispose of electronic records in their custody which are not of enduring value; assist agencies in the identification of information or metadata which needs to be captured and maintained; provide advice on access to archival electronic records. (p. 255-256) <P9> The elements of the records continuum need to be reflected as components in the business strategy for archival institutions in the provision of services to its clients. (p. 256)
Conclusions
RQ "In summary I see the unresolved issues and potential research tasks as follows: International Agreement (UN, ICA); National Agreement (Government, Corporate, Sectoral, Professional); Cultural Change in the Archives and Records Community; Re-inventing / re-engineering Archives institutions; Re-training or recruiting; Best practice sites -- the National Archives as a model for best practice recordkeeping; Test sites for creation, capture, migration and networking of records; Functional analysis and appraisal of electronic information systems (electronic recordkeeping systems); Costing the retention of electronic records and records in electronic form." (p. 257)
Type
Journal
Title
Grasping the Nettle: The Evolution of Australian Archives Electronic Records Policy
CA An overview of the development of electronic records policy at the Australian Archives.
Phrases
<P1> The notion of records being independent of format and of "virtual" records opens up a completely new focus on what it is that archival institutions are attempting to preserve. (p. 136) <P2> The import of Bearman's contention that not all information systems are recordkeeping systems challenges archivists to move attention away from managing archival records after the fact toward involvement in the creation phase of records, i.e., in the systems design and implementation process. (p. 139) <P3> The experience of the Australian Archives is but one slice of a very large pie, but I think it is a good indication of the challenges other institutions are facing internationally. (p. 144)
Conclusions
RQ How has the Australian Archives managed the transition from paper to electronic records? What issues were raised and how were they dealt with?
CA The desire for hard, unassailable recordkeeping rules ignores the fact that recordkeeping is contingent upon the unique needs of each organization with respect to acceptable risks and context. Reed argues that aiming to achieve basic agreement on a minimal set of metadata attributes is an important start.
Phrases
<P1> Recordkeeping must be tailored to the requirements of specific business functions and activities linked to related social and legal requirements, incorporated into particular business processes, and maintained through each change to those processes. (p. 222) <P2> A record core or metadata set which lacks such specificity, detailing only requirements for a unique identifier, will not support interpretation of the record outside the creating domain. To enable that, we need more detailed specification of the domain itself, data which is redundant when you know where you are, but essential to understanding and interpreting records where the domain is not explicit. (p. 229)
Conclusions
RQ To establish requirements for viable core elements, the big challenge is the issue of time and the fact that data will change over time -- especially with regard to individual competence, business function and language.
CA Through training, it is advisable to educate employees on the distinctions between data management functions and how they relate to electronic records management.
Phrases
<P1> Much of the archival practice in the management of electronic records until the last few years, and of the literature describing this practice, has been concerned with managing databases as machine-readable or electronic records. Applying the transaction/evidence test to many databases indicates that they should not be regarded as records in this sense, operating as electronic information systems rather than as electronic recordkeeping systems, to use the neat distinction made by Bearman and his colleagues. (p.18) <P2> Where the essentially evidential quality of a record is not accepted, that is, where records are simply equated with recorded information, the distinction between records and documents tends to disappear . . . a document may be distinguished from a record by the latter's evidential quality in documenting the transaction of business. (p.19)
One such expedient could be more structured and more integrated use of formal and institutional data on records and archives. I cannot offer any completed model of this enhanced perspective, and as far as I know, one does not exist. However, it is a new way of thinking and looking at the problems we encounter. What I would like to do is draw attention to some of the approaches now being developed in The Netherlands. In a way, this presentation will therefore be a report on the Dutch archival situation.
Critical Arguements
CA "In a world defined by the enormous size of archives, where the multiplicity of records is in turn driven by the growing complexity of society and its administration, and by the proliferation of types of 'information carriers', it is becoming increasingly difficult for archivists to fulfill their primary tasks. It is therefore necessary to study carefully the development of maintenance and control mechanisms for archives. We cannot afford to waste or overlook any possibility. It is also necessary to look around us, to discover what other archivists in other countries are doing, and what others in related fields, such as libraries and museums, have accomplished. Essentially, we all deal with the same problems and must try to find new solutions to master these problems."
Phrases
<P1> Document forms can be regarded as forms of objects. We probably need to gain more experience in recognizing different forms of documents and interpreting them, but once we have this knowledge, we can use it in the same way as we now use 'form' in its archival sense: to distinguish one object from another. <P2> In fact, by extension, one can even construct and defend the thesis that all decisions in an administration are reached using standard procedures and forms. Once this is realized, one can ask: what use do archivists make of this knowledge in their daily work? What are the possibilities? <P3> Often the forms of materials created prove to be of a more consistent nature than the offices that use them. If an office ceases its activity, another will take over its tasks and for the most part will use the same or almost the same forms of material. <P4> Understanding the functions of the organization will provide archivists not only with information about the material involved, but also with knowledge of the procedures, which in turn provides information about the records and their different forms. This kind of sympathetic understanding enables archivists to make all kinds of decisions, and it is important to note that at least part of this knowledge should be provided to the users, so that they can decide which records might be of interest to them. <warrant> <P5> We are increasingly aware that we must distinguish between processing an archive (i.e. organizing records according to archival principles after appraisal) and making the contents available for users through finding aids, indexes and other means. <P6> With respect to the latter, it is clear that archivists should make use of both context- and provenance-based indexing. They should take advantage of the possibilities offered by the structures and forms of material -- something which the librarian cannot do. 
Furthermore, they should also use content indexing in a selective way, only when they think it necessary [to] better serve researchers. <warrant> <P7> The National Archives in The Hague has responded to these new perspectives by developing a computer programme called MAIS (Micro Archives Inventory System), which is a formal way of processing archives based on provenance. <P8> The object of this presentation has been to show that use of structure, forms of material and functions, can aid the archivist in his/her work.
Conclusions
RQ "While these initial Dutch efforts have been produced in a rather unorganized way, it should nevertheless be possible to approach the work more systematically in [the] future, building up a body of knowledge of forms for users of archives. David Bearman has offered some preliminary suggestions in this direction, in the article cited above; it is now a matter of more research required to realize something positive in this field."
SOW
DC J. Peter Sigmond is Director of Collections at the Rijksmuseum in Amsterdam, the Netherlands.
Type
Journal
Title
Structuring the Records Continuum Part Two: Structuration Theory and Recordkeeping
In the previous issue of Archives and Manuscripts I presented the first part of this two part exploration. It dealt with some possible meanings for 'post' in the term postcustodial. For archivists, considerations of custody are becoming more complex because of changing social, technical and legal considerations. These changes include those occurring in relation to access and the need to document electronic business communications reliably. Our actions, as archivists, in turn become more complex as we attempt to establish continuity of custody in electronic recordkeeping environments. In this part, I continue the case for emphasising the processes of archiving in both our theory and practice. The archives as a functional structure has dominated twentieth century archival discourse and institutional ordering, but we are going through a period of transformation. The structuration theory of Anthony Giddens is used to show that there are very different ways of theorising about our professional activities than have so far been attempted within the archival profession. Giddens' theory, at the very least, provides a useful device for gaining insights into the nature of theory and its relationship with practice. The most effective use of theory is as a way of seeing issues. When seen through the prism of structuration theory, the forming processes of the virtual archives are made apparent.
Critical Arguements
CA "This part of my exploration of the continuum will continue the case for understanding 'postcustodial' as a bookmark term for a major transition in archival practice. That transition involves leaving a long tradition in which continuity was a matter of sequential control. Electronic recordkeeping processes need to incorporate continuity into the essence of recordkeeping systems and into the lifespan of documents within those systems. In addressing this issue I will present a structurationist reading of the model set out in Part 1, using the sophisticated theory contained in the work of Anthony Giddens. Structuration theory deals with process, and illustrates why we must constantly re-assess and adjust the patterns for ordering our activities. It gives some leads on how to go about re-institutionalising these new patterns. When used in conjunction with continuum thinking, Giddens' meta-theory and its many pieces can help us to understand the complexities of the virtual archives, and to work our way towards the establishment of suitable routines for the control of document management, records capture, corporate memory, and collective memory."
Phrases
<P1> Broadly the debate has started to form itself as one between those who represent the structures and functions of an archival institution in an idealised form, and those who increasingly concentrate on the actions and processes which give rise to the record and its carriage through time and space. In one case the record needs to be stored, recalled and disseminated within our institutional frameworks; in the other case it is the processes for storing, recalling, and disseminating the record which need to be placed into a suitable framework. <P2> Structure, for Giddens, is not something separate from human action. It exists as memory, including the memory contained within the way we represent, recall, and disseminate resources including recorded information. <P3> Currently in electronic systems there is an absence of recordkeeping structures and disconnected dimensions. The action part of the duality has raced ahead of the structural one; the structuration process has only just begun. <P4> The continuum model's breadth and richness as a conceptual tool is expanded when it is seen that it can encompass action-structure issues in at least three specialisations within recordkeeping: contemporary recordkeeping - current recordkeeping actions and the structures in which they take place; regulatory recordkeeping - the processes of regulation and the enabling and controlling structures for action such as policies, standards, codes, legislation, and promulgation of best practices; historical recordkeeping - explorations of provenance in which action and structure are examined forensically as part of the data sought about records for their storage, recall and dissemination. <P5> The capacity to imbibe information about recordkeeping practices in agencies will be crucial to the effectiveness of the way archival 'organisations' set up their postcustodial programs. 
They will have to monitor the distribution and exercise of custodial responsibilities for electronic records from before the time of their creation. <warrant> <P6> As John McDonald has pointed out, recordkeeping activities need to occur at desktop level within systems that are not dependent upon the person at the desktop understanding all of the details of the operation of that system. <P7> Giddens' more recent work on reflexivity has many parallels with metadata approaches to recordkeeping. What if the records, as David Bearman predicts, can be self-managing? Will they be able to monitor themselves? <P8> He rejects the life cycle model in sociology, based on ritualised passages through life, and writes of 'open experience thresholds'. Once societies, for example, had rites for coming of age. Coming of age in a high modern society is now a complex process involving a host of experiences and risks which are very different to that of any previous generation. Open experience thresholds replace the life cycle thresholds, and as the term infers, are much less controlled or predictable. <P9> There is a clear parallel with recordkeeping in a high modern environment. The custodial thresholds can no longer be understood in terms of the spatial limits between a creating agency and an archives. The externalities of the archives as place will decline in significance as a means of directly asserting the authenticity and reliability of records. The complexities of modern recordkeeping involve many more contextual relationships and an ever increasing network of relationships between records and the actions that take place in relation to them. We have no need for a life cycle concept based on the premise of generational repetition of stages through which a record can be expected to pass. We have entered an age of more recordkeeping choices and of open experience thresholds.
<P10> It is the increase in transactionality, and the technologies being used for those transactions, which are different. The solution, easier to write about than implement, is for records to parallel Giddens' high modern individual and make reflexive use of the broader social environment in which they exist. They can reflexively monitor their own action and, with encoding help from archivists and records managers, resolve their own crises as they arise. <warrant> <P11> David Bearman's argument that records can be self-managing goes well beyond the easy stage. It is supported by the Pittsburgh project's preliminary set of metadata specifications. The seeds of self-management can be found in object oriented programming, java, applets, and the growing understanding of the importance and nature of metadata. <P12> Continuum models further assist us to conceive of how records, as metadata encapsulated objects, can resolve many of their own life crises as they thread their way through time and across space. <P13> To be effective monitors of action, archival institutions will need to be recognised by others as the institutions most capable of providing guidance and control in relation to the integration of the archiving processes involved in document management, records capture, the organisation of corporate memory and the networking of archival systems. <warrant> <P14> Signification, in the theoretical domain, refers to our interpretative schemes and the way we encode and communicate our activities. At a macro level this includes language itself; at a micro level it can include our schemes for classification and ordering. <P15> The Pittsburgh project addressed the three major strands of Giddens' theoretical domain. It explored and set out functional requirements for evidence - signification. It sought literary warrants for archival tasks - legitimation. It reviewed the acceptability of the requirements for evidence within organisational cultures - domination. 
<P16> In Giddens' dimensional approach, the theoretical domain is re-defined to be about coding, organising our resources, and developing norms and standards. In this area the thinking has already begun to produce results, which leads this article in to a discussion of structural properties. <P17> Archivists deal with structural properties when, for example, they analyse the characteristics of recorded information such as the document, the record, the archive and the archives. The archives as a fortress is an observable structural property, as is the archives as a physical accumulation of records. Within Giddens' structuration theory, when archivists write about their favourite features, be they records or the archives as a place, they are discussing structural properties. <P18> Postcustodial practice in Australia is already beginning to put together a substantial array of structural properties. These developments are canvassed in the article by O'Shea and Roberts in the previous issue of Archives and Manuscripts. They include policies and strategies, standards, recordkeeping regimes, and what has come to be termed distributed custody. <P19> As [Terry] Eastwood comments in the same article, we do not have adequate electronic recordkeeping systems. Without them there can be no record in time-space to serve any form of accountability. <warrant> <P20> In the Pittsburgh project, for example, the transformation of recordkeeping processes is directed towards the creation and management of evidence, and possible elements of a valid rule-resource set have emerged. Elements can include the control of recordkeeping actions, accountability, the management of risk, the development of recordkeeping regimes, the establishment of recordkeeping requirements, and the specification of metadata. <P21> In a postcustodial approach it is the role of archival institutions to foster better recordkeeping practices within all the dimensions of recordkeeping. <warrant>
Conclusions
RQ "Best practice in the defence of the authoritative qualities of records can no longer be viewed as a linear chain, and the challenge is to establish new ways of legitimating responsibilities for records storage and custody which recognise the shifts which have occurred." ... "The recordkeeping profession should seek to establish itself as ground cover, working across terrains rather than existing tree-like in one spot. Beneath the ground cover there are shafts of specialisation running both laterally and vertically. Perhaps we can, as archivists, rediscover something that a sociologist like Giddens has never forgotten. Societies, including their composite parts, are the ultimate containers of recorded information. As a place in society, as Terry Cook argues, the archives is a multiple reality. We can set in train policies and strategies that can help generate multiplicity without losing respect for particular mine shafts. Archivists have an opportunity to pursue policies which encourage the responsible exercising of a custodial role throughout society, including the professions involved in current, regulatory and historical recordkeeping. If we take up that opportunity, our many goals can be better met and our concerns will be addressed more effectively."
SOW
DC "Frank Upward is a senior lecturer in the Department of Librarianship, Archives and Records at Monash University. He is an historian of the ideas contained in the Australian records continuum approach, and an ex-practitioner within that approach." ... "These two articles, and an earlier one on Ian Maclean and the origins of Australian continuum thinking, have not, so far, contained appropriate acknowledgements. David Bearman provided the necessary detonation of certain archival practices, and much more. Richard Brown and Terry Cook drew my attention to Anthony Giddens' work and their own work has helped shape my views. I have many colleagues at Monash who encourage my eccentricities. Sue McKemmish has helped shape my ideas and my final drafts and Barbara Reed has commented wisely on my outrageous earlier drafts. Livia Iacovino has made me stop and think more about the juridical tradition in recordkeeping. Chris Hurley produced many perspectives on the continuum during the 1996 seminars which have helped me see the model more fully. Don Schauder raised a number of key questions about Giddens as a theorist. Bruce Wearne of the Sociology Department at Monash helped me lift the clarity of my sociological explanations and made me realise how obsessed Giddens is with gerunds. The structural-functionalism of Luciana Duranti and Terry Eastwood provided me with a counterpoint to many of my arguments, but I also owe them debts for their respective explorations of recordkeeping processes and the intellectual milieu of archival ideas, and for their work on the administrative-juridical tradition of recordkeeping. Glenda Acland has provided perceptive comments on my articles - and supportive ones, for which I am most grateful given how different the articles are from conventional archival theorising. Australian Archives, and its many past and present staff members, has been important to me."
Type
Journal
Title
Structuring the Records Continuum Part One: Post-custodial principles and properties
The records continuum is becoming a much used term, but has seldom been defined in ways which show it is a time/space model not a life of the records model. Dictionary definitions of a continuum describe such features as its continuity, the indiscernibility of its parts, and the way its elements pass into each other. Precise definitions, accordingly, have to discern the indiscernible, identify points that are not distinct, and do so in ways which accommodate the continuity of change. This article, and a second part to be published in the next volume, will explore the continuum in time/space terms supported by a theoretical mix of archival science, postmodernity and the 'structuration theory' of Anthony Giddens. In this part the main objectives are to give greater conceptual firmness to the continuum; to clear the way for broader considerations of the nature of the continuum by freeing archivists from the need to debate custody; to show how the structural principles for archival practice are capable of different expression without losing contact with something deeper that can outlive the manner of expression.
Critical Arguements
CA "This is the first instalment of a two part article exploring the records continuum. Together the articles will build into a theory about the constitution of the virtual archives. In this part I will examine what it can mean to be 'postcustodial', outline some possible structural principles for the virtual archives, and present a logical model for the records continuum." ... "In what follows in the remainder of this article (and all of the next), I will explore the relevance of [Anthony] Giddens' theory to the structuring of the records continuum."
Phrases
<P1> If the archival profession is to avoid a fracture along the lines of paper and electronic media, it has to be able to develop ways of expressing its ideas in models of relevance to all ages of recordkeeping, but do so in ways which are contemporaneous with our own society. <warrant> <P2> We need more of the type of construct provided by the Pittsburgh Project's functional requirements for evidence which are 'high modern' but can apply to recordkeeping over time. <P3> What is essential is for electronic records to be identified, controlled and accessible for as long as they have value to Government and the Community. <warrant> <P4> We have to face up to the complexification of ownership, possession, guardianship and control within our legal system. Even possession can be broken down into physical possession and constructed possession. We also have to face the potential within our technology for ownership, possession, custody or control to be exercised jointly by the archives, the organisation creating the records, and auditing agencies. The complexity requires a new look at our way of allocating authorities and responsibilities. <P5> In what has come to be known as the continuum approach Maclean argued that archivists should base their profession upon studies of the characteristics of recorded information, recordkeeping systems, and classification (the way the records were ordered within recordkeeping systems and the way these were ordered through time). <P6> A significant role for today's archival institution is to help to identify and establish functional requirements for recordkeeping that enable a more systematic approach to authentication than that provided by physical custody. <warrant> <P7> In an electronic work environment it means, in part, that the objectivity, understandability, availability, and usability of records need to be inherent in the way that the record is captured.
In turn the documents need to be captured in the context of the actions of which they are part, and are recursively involved. <warrant> <P8>A dimensional analysis can be constructed from the model and explained in a number of ways including a recordkeeping system reading. When the co-ordinates of the continuum model are connected, the different dimensions of a recordkeeping system are revealed. The dimensions are not boundaries, the co-ordinates are not invariably present, and things may happen simultaneously across dimensions, but no matter how a recordkeeping system is set up it can be analysed in terms such as: first dimensional analysis: a pre- communication system for document creation within electronic systems [creating the trace]; second dimensional analysis: a post- communication system, for example traditional registry functionality which includes registration, the value adding of data for linking documents and disseminating them, and the maintenance of the record including disposition data [capturing trace as record]; third dimensional analysis: a system involving building, recalling and disseminating corporate memory [organising the record as memory]; fourth dimensional analysis: a system for building, recalling and disseminating collective memory (social, cultural or historical) including information of the type required for an archival information system [pluralizing the memory]. <P9> In the high modern recordkeeping environment of the 1990's a continuum has to take into account a different array of recordkeeping tools. These tools, plucking a few out at random but ordering the list dimensionally, include: document management software, Australian records system software, the intranet and the internet. <P10> In terms of a records continuum which supports an evidence based recordkeeping approach, the second dimension is crucial. This is where the document is disembedded from the immediate contexts of the first dimension. 
It is this disembedding process that gives the record its value as a 'symbolic token'. A document is embedded in an act, but the document as a record needs to be validatable using external reference points. These points include the operation of the recordkeeping system into which it was received, and information pertaining to the technical, social (including business) and communication processes of which the document was part.
Conclusions
RQ "Postcustodial approaches to archives and records cannot be understood if they are treated as a dualism. They are not the opposite of custody. They are a response to opportunities for asserting the role of an archives - and not just its authentication role - in many re-invigorating ways, a theme which I will explore further in the next edition of Archives and Manuscripts."
SOW
DC "Frank Upward is a senior lecturer in the Department of Librarianship, Archives and Records at Monash University. He is an historian of the ideas contained in the Australian records continuum approach, and an ex-practitioner within that approach."
CA Strength of a metadata structure lies in its ability to layer and exchange information from a wide variety of creators in a "loosely coupled system of organization." This review covers metadata literature from mid 1996 through early 1998. It focuses on the development and application of metadata standards used by the LIS community for resource description, discovery and retrieval within a digital environment.
Type
Journal
Title
The role of standards in the archival management of electronic records
CA Technical standards, developed by national and international organizations, are increasingly important in electronic recordkeeping. Thirteen standards are summarized and their sponsoring organizations described.
Phrases
<P1> The challenge to archivists is to make sure that the standards being applied to electronic records systems today are adequate to ensure the long-term preservation and use of information contained in the systems. (p.31) <P2> While consensus can easily be established that data exchange standards offer a wealth of potential benefits, there are also a number of real barriers to implementation that make the road ahead for archivists a very bumpy one. (p.41)
Conclusions
RQ What is the current state of standardization in the archival management of electronic records and what are the issues involved?
Type
Journal
Title
Archives and the information superhighway: Current status and future challenges
CA One struggle facing us is to convince the rest of society that the "information superhighway" is very much about records, evidence and "recordness".
Phrases
<P1> It has been argued that existing computer software applications harm recordkeeping because they are remiss in capturing the full breadth of contextual information required to document transactions and create records -- records which can serve as reliable evidence of the transactions which created them. In place of records, these systems are producing data which fails to relate the who, what, when, where, and why of human communications -- attributes which are required for record evidence. This argument has found both saliency and support in other work conducted by the Netherlands and the World Bank, which have both noted that existing software applications fail to provide for the capture of the required complement of descriptive attributes required for proper recordkeeping. These examples point to the vast opportunity presented to archivists to position themselves as substantive contributors to information infrastructure discussions. Archivists are capable of pointing out what will be necessary to create records in the electronic environment which, in the words of David Bearman, meet the requirements of "business acceptable communication". (p.87) <warrant>
Conclusions
RQ Can archivists provide access to information in the unstable electronic records environment we find ourselves in today?
Type
Journal
Title
Managing the Present: Metadata as Archival Description
Traditional archival description undertaken at the terminal stages of the life cycle has had two deleterious effects on the archival profession. First, it has resulted in enormous, and in some cases, insurmountable processing backlogs. Second, it has limited our ability to capture crucial contextual and structural information throughout the life cycle of record-keeping systems that are essential for fully understanding the fonds in our institutions. This shortcoming has resulted in an inadequate knowledge base for appraisal and access provision. Such complications will only become more magnified as distributed computing and complex software applications continue to expand throughout organizations. A metadata strategy for archival description will help mitigate these problems and enhance the organizational profile of archivists who will come to be seen as valuable organizational knowledge and accountability managers.
Critical Arguements
CA "This essay affirms this call for evaluation and asserts that the archival profession must embrace a metadata systems approach to archival description and management." ... "It is held here that the requirements for records capture and description are the requirements for metadata."
Phrases
<P1> New archival organizational structures must be created to ensure that records can be maintained in a usable form. <warrant> <P2> The recent report of Society of American Archivists (SAA) Committee on Automated Records and Techniques (CART) on curriculum development has argued that archivists need to "understand the nature and utility of metadata and how to interpret and use metadata for archival purposes." <warrant> <P3> The report advises archivists to acquire knowledge on the meanings of metadata, its structures, standards, and uses for the management of electronic records. Interestingly, the requirements for archival description immediately follow this section and note that archivists need to isolate the descriptive requirements, standards, documentation, and practices needed for managing electronic records. <warrant> <P4> Clearly, archivists need to identify what types of metadata will best suit their descriptive needs, underscoring the need for the profession to develop strategies and tactics to satisfy these requirements within active software environments. <warrant> <P5> Underlying the metadata systems strategy for describing and managing electronic information technologies is the seemingly universal agreement amongst electronic records archivists on the requirement to intervene earlier in the life cycle of electronic information systems. <warrant> <P6> Metadata has loomed over the archival management of electronic records for over five years now and is increasingly being promised as a basic control strategy for managing these records. <warrant> <P7> However, she [Margaret Hedstrom] also warns that as descriptive practices shift from creating descriptive information to capturing description along with the records, archivists may discover that managing the metadata is a much greater challenge than managing the records themselves.
<P8> Archivists must seek to influence the creation of record-keeping systems within organizations by connecting the transaction that created the data to the data itself. Such a connection will link informational content, structure, and the context of transactions. Only when these conditions are met will we have records and an appropriate infrastructure for archival description. <warrant> <P9> Charles Dollar has argued that archivists increasingly will have to rely upon and shape the metadata associated with electronic records in order to fully capture provenance information about them. <warrant> <P10> Bearman proposes a metadata systems strategy, which would focus more explicitly on the context out of which records arise, as opposed to concentrating on their content. This axiom is premised on the assumption that "lifecycle records systems control should drive provenance-based description and link to top-down definitions of holdings." <warrant> <P11> Bearman and Margaret Hedstrom have built upon this model and contend that properly specified metadata capture could fully describe systems while they are still active and eliminate the need for post-hoc description. The fundamental change wrought in this approach is the shift from doing things to records (surveying, scheduling, appraising, disposing/accessioning, describing, preserving, and accessing) to providing policy direction for adequate documentation through management of organizational behavior (analyzing organizational functions, defining business transactions, defining record metadata, identifying control tactics, and establishing the record-keeping regime). Within this model archivists focus on steering how records will be captured (and that they will be captured) and how they will be managed and described within record-keeping systems while they are still actively serving their parent organization.
<P12> Through the provision of policy guidance and oversight, organizational record-keeping is managed in order to ensure that the "documentation of organizational missions, functions, and responsibilities ... and reporting relationships within the organization, will be undertaken by the organizations themselves in their administrative control systems." <warrant> <P13> Through a metadata systems approach, archivists can realign themselves strategically as managers of authoritative information about organizational record-keeping systems, providing for the capture of information about each system, its contextual attributes, its users, its hardware configurations, its software configurations, and its data configurations. <warrant> <P14> The University of Pittsburgh's functional requirements for record-keeping provides a framework for such information management structure. These functional requirements are appropriately viewed as an absolute ideal, requiring testing within live systems and organizations. If properly implemented, however, they can provide a concrete model for metadata capture that can automatically supply many of the types of descriptive information both desired by archivists and required for elucidating the context out of which records arise. <P15> It is possible that satisfying these requirements will contribute to the development of a robust archival description process integrating "preservation of meaning, exercise of control, and provision of access'" within "one principal, multipurpose descriptive instrument" hinted at by Luciana Duranti as a possible outcome of the electronic era. <P16> However, since electronic records are logical and not physical entities, there is no physical effort required to access and process them, just mental modelling.
<P17> Depending on the type of metadata that is built into and linked to electronic information systems, it is possible that users can identify individual records at the lowest level of granularity and still see the top-level process it is related to. Furthermore, records can be reaggregated based upon user-defined criteria through metadata links that track every instance of their use, their relations to other records, and the actions that led to their creation. <P18> A metadata strategy for archival description will help to mitigate these problems and enhance the organizational profile of archivists, who will come to be seen as valuable organizational knowledge and accountability managers. <warrant>
Conclusions
RQ "First and foremost, the promise of metadata for archival description is contingent upon the creation of electronic record-keeping systems as opposed to a continuation of the data management orientation that seems to dominate most computer applications within organizations." ... "As with so many other aspects of the archival endeavour, these requirements and the larger metadata model for description that they are premised upon necessitate further exploration through basic research."
SOW
DC "In addition to New York State, recognition of the failure of existing software applications to capture a full complement of metadata required for record-keeping and the need for such records management control has also been acknowledged in Canada, the Netherlands, and the World Bank." ... "In conjunction with experts in electronic records management, an ongoing research project at the University of Pittsburgh has developed a set of thirteen functional requirements for record-keeping. These requirements provide a concrete metadata tool sought by archivists for managing and describing electronic records and electronic record-keeping systems." ... David A. Wallace is an Assistant Professor at the School of Information, University of Michigan, where he teaches in the areas of archives and records management. He holds a B.A. from Binghamton University, a Masters of Library Science from the University at Albany, and a doctorate from the University of Pittsburgh. Between 1988 and 1992, he served as Records/Systems/Database Manager at the National Security Archive in Washington, D.C., a non-profit research library of declassified U.S. government records. While at the NSA he also served as Technical Editor to their "The Making of U.S. Foreign Policy" series. From 1993-1994, he served as a research assistant to the University of Pittsburgh's project on Functional Requirements for Evidence in Recordkeeping, and as a Contributing Editor to Archives and Museum Informatics: Cultural Heritage Informatics Quarterly. From 1994 to 1996, he served as a staff member to the U.S. Advisory Council on the National Information Infrastructure. In 1997, he completed a dissertation analyzing the White House email "PROFS" case.
Since arriving at the School of Information in late 1997, he has served as Co-PI on an NHPRC funded grant assessing strategies for preserving electronic records of collaborative processes, as PI on an NSF Digital Government Program funded planning grant investigating the incorporation of born digital records into a FOIA processing system, co-edited Archives and the Public Good: Accountability and Records in Modern Society (Quorum, 2002), and was awarded ARMA International's Britt Literary Award for an article on email policy. He also serves as a consultant to the South African History Archives Freedom of Information Program and is exploring the development of a massive digital library of declassified imaged/digitized U.S. government documents charting U.S. foreign policy.
Annual Review of Information Science and Technology
Periodical Abbreviation
ARIST
Publication Year
2001
Volume
35
Pages
337
Critical Arguements
CA Yakel gives an overview of the literature on digital preservation from the early 1980s through 2000.
Phrases
<P1> The immediate question is whether the industry will adopt these standards, strategies, and functional elements to create evidence-based recordkeeping systems that ensure their authenticity and reliability. If the model of the DOD guidelines is any indication, some sectors of the vendor population will respond to recordkeeping specifications if there is sufficient customer leverage. Recordkeeping requirements rely on various metadata schemes and the viability of standards. (p.366) <warrant>
Conclusions
RQ More research into the hybrid approach (emulation and migration) is needed to determine criteria that will ensure the preservation of authenticity and reliability.
Type
Electronic Journal
Title
ARTISTE: An integrated Art Analysis and Navigation Environment
This article describes the objectives of the ARTISTE project (for "An integrated Art Analysis and Navigation environment"), which aims to build a tool for the intelligent retrieval and indexing of high-resolution images. The ARTISTE project will address professional users in the fine arts as the primary end-user base. These users provide services for the ultimate end-user, the citizen.
Critical Arguements
CA "European museums and galleries are rich in cultural treasures but public access has not reached its full potential. Digital multimedia can address these issues and expand the accessible collections. However, there is a lack of systems and techniques to support both professional and citizen access to these collections."
Phrases
<P1> New technology is now being developed that will transform that situation. A European consortium, partly funded by the EU under the fifth R&D framework, is working to produce a new management system for visual information. <P2> Four major European galleries (The Uffizi in Florence, The National Gallery and the Victoria and Albert Museum in London and the Louvre related restoration centre, Centre de Recherche et de Restauration des Musées de France) are involved in the project. They will be joining forces with NCR, a leading player in database and Data Warehouse technology; Interactive Labs, the new media design and development facility of Italy's leading art publishing group, Giunti; IT Innovation, Web-based system developers; and the Department of Electronics and Computer Science at the University of Southampton. Together they will create web based applications and tools for the automatic indexing and retrieval of high-resolution art images by pictorial content and information. <P3> The areas of innovation in this project are as follows: Using image content analysis to automatically extract metadata based on iconography, painting style etc; Use of high quality images (with data from several spectral bands and shadow data) for image content analysis of art; Use of distributed metadata using RDF to build on existing standards; Content-based navigation for art documents separating links from content and applying links according to context at presentation time; Distributed linking and searching across multiple archives allowing ownership of data to be retained; Storage of art images using large (>1TeraByte) multimedia object relational databases. <P4> The ARTISTE approach will use the power of object-related databases and content-retrieval to enable indexing to be made dynamically, by non-experts.
<P5> In other words ARTISTE would aim to give searchers tools which hint at links due to say colour or brush-stroke texture rather than saying "this is the automatically classified data". <P6> The ARTISTE project will build on and exploit the indexing scheme proposed by the AQUARELLE consortia. The ARTISTE project solution will have a core component that is compatible with existing standards such as Z39.50. The solution will make use of emerging technical standards XML, RDF and X-Link to extend existing library standards to a more dynamic and flexible metadata system. The ARTISTE project will actively track and make use of existing terminology resources such as the Getty "Art and Architecture Thesaurus" (AAT) and the "Union List of Artist Names" (ULAN). <P7> Metadata will also be stored in a database. This may be stored in the same object-relational database, or in a separate database, according to the incumbent systems at the user partners. <P8> RDF provides for metadata definition through the use of schemas. Schemas define the relevant metadata terms (the namespace) and the associated semantics. Individual RDF queries and statements may use multiple schemas. The system will make use of existing schemas such as the Dublin Core schema and will provide wrappers for existing resources such as the Art and Architecture thesaurus in a RDF schema wrapper. <P9> The Distributed Query and Metadata Layer will also provide facilities to enable queries to be directed towards multiple distributed databases. The end user will be able to seamlessly search the combined art collection. This layer will adhere to worldwide digital library standards such as Z39.50, augmenting and extending as necessary to allow the richness of metadata enabled by the RDF standard.
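The cross-collection query conversion described in <P6>-<P9> can be sketched as a two-step mapping: a native field is translated to a shared Dublin Core term, then to the target collection's native field. This is a minimal illustration only; the actual system expressed these mappings in RDF schemas and layered them over Z39.50. All field names below are hypothetical, not the museums' real schemas.

```python
# Sketch: translating a query field from one collection's schema to
# another's via a Dublin Core pivot, in the spirit of the ARTISTE
# RDF-mapping approach. Field names are invented for illustration.

# Each collection maps its native fields to unqualified Dublin Core terms.
LOUVRE_TO_DC = {"auteur": "creator", "titre": "title", "date_creation": "date"}
UFFIZI_TO_DC = {"artista": "creator", "titolo": "title", "anno": "date"}

def translate_field(field, source_map, target_map):
    """Translate a native field name through the Dublin Core pivot."""
    dc_term = source_map.get(field)
    if dc_term is None:
        return None  # no mapping: the field cannot be cross-searched
    # Invert the target map: DC term -> target collection's native field.
    inverse = {dc: native for native, dc in target_map.items()}
    return inverse.get(dc_term)

# A query on the Louvre's "auteur" field becomes a query on the
# Uffizi's "artista" field.
assert translate_field("auteur", LOUVRE_TO_DC, UFFIZI_TO_DC) == "artista"
```

The pivot format keeps the number of mappings linear in the number of collections, rather than quadratic in pairwise crosswalks.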
Conclusions
RQ "In conclusion the Artiste project will result in an interesting and innovative system for art analysis, indexing, storage and navigation. The actual state of the art of content-based retrieval systems will be positively influenced by the development of the Artiste project, which will pursue the following goals: A solution which can be replicated to European galleries, museums, etc.; Deep-content analysis software based on object relational database technology; Distributed links server software, user interfaces, and content-based navigation software; A fully integrated prototype analysis environment; Recommendations for the exploitation of the project solution by European museums and galleries; Recommendations for the exploitation of the technology in other sectors; "Impact on standards" report detailing augmentations of Z39.50 with RDF." ... "Not much research has been carried out worldwide on new algorithms for style-matching in art. This is probably not a major aim in Artiste but could be a spin-off if the algorithms made for specific author search requirements happen to provide data which can be combined with other data to help classify styles."
SOW
DC "Four major European galleries (The Uffizi in Florence, The National Gallery and the Victoria and Albert Museum in London and the Louvre related restoration centre, Centre de Recherche et de Restauration des Musées de France) are involved in the project. They will be joining forces with NCR, a leading player in database and Data Warehouse technology; Interactive Labs, the new media design and development facility of Italy's leading art publishing group, Giunti; IT Innovation, Web-based system developers; and the Department of Electronics and Computer Science at the University of Southampton. Together they will create web based applications and tools for the automatic indexing and retrieval of high-resolution art images by pictorial content and information."
Type
Electronic Journal
Title
Keeping Memory Alive: Practices for Preserving Digital Content at the National Digital Library Program of the Library of Congress
CA An overview of the major issues and initiatives in digital preservation at the Library of Congress. "In the medium term, the National Digital Library Program is focusing on two operational approaches. First, steps are taken during conversion that are likely to make migration or emulation less costly when they are needed. Second, the bit streams generated by the conversion process are kept alive through replication and routine refreshing supported by integrity checks. The practices described here provide examples of how those steps are implemented to keep the content of American Memory alive."
Phrases
<P1> The practices described here should not be seen as policies of the Library of Congress; nor are they suggested as best practices in any absolute sense. NDLP regards them as appropriate practices based on real experience, the nature and content of the originals, the primary purposes of the digitization, the state of technology, the availability of resources, the scale of the American Memory digital collection, and the goals of the program. They cover not just the storage of content and associated metadata, but also aspects of initial capture and quality review that support the long-term retention of content digitized from analog sources. <P2> The Library recognizes that digital information resources, whether born digital or converted from analog forms, should be acquired, used, and served alongside traditional resources in the same format or subject area. Such responsibility will include ensuring that effective access is maintained to the digital content through American Memory and via the Library's main catalog and, in coordination with the units responsible for the technical infrastructure, planning migration to new technology when needed. <P3> Refreshing can be carried out in a largely automated fashion on an ongoing basis. Migration, however, will require substantial resources, in a combination of processing time, out-sourced contracts, and staff time. Choice of appropriate formats for digital masters will defer the need for large-scale migration. Integrity checks and appropriate capture of metadata during the initial capture and production process will reduce the resource requirements for future migration steps. <warrant> We can be certain that migration of content to new data formats will be necessary at some point. The future will see industrywide adoption of new data formats with functional advantages over current standards. 
However, it will be difficult to predict exactly which metadata will be useful to support migration, when migration of master formats will be needed, and the nature and extent of resource needs. Human experts will need to decide when to undertake migration and develop tools for each migration step. <P4> Effective preservation of resources in digital form requires (a) attention early in the life-cycle, at the moment of creation, publication, or acquisition and (b) ongoing management (with attendant costs) to ensure continuing usability. <P5> The National Digital Library Program has identified several categories of metadata needed to support access and management for digital content. Descriptive metadata supports discovery through search and browse functions. Structural metadata supports presentation of complex objects by representing relationships between components, such as sequences of images. In addition, administrative metadata is needed to support management tasks, such as access control, archiving, and migration. Individual metadata elements may support more than one function, but the categorization of elements by function has proved useful. <P6> It has been recognized that metadata representations appropriate for manipulation and long-term retention may not always be appropriate for real-time delivery. <P7> It has also been realized that some basic descriptive metadata (at the very least a title or brief description) should be associated with the structural and administrative metadata. <P8> During 1999, an internal working group reviewed past experience and prototype exercises and compiled a core set of metadata elements that will serve the different functions identified. This set will be tested and refined as part of pilot activities during 2000. <P9> Master formats are well documented and widely deployed, preferably formal standards and preferably non-proprietary. 
Such choices should minimize the need for future migration or ensure that appropriate and affordable tools for migration will be developed by the industry. <warrant>
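The practice of keeping bit streams alive "through replication and routine refreshing supported by integrity checks" (see the head note and <P3>) can be sketched with a fixity check: a digest recorded as administrative metadata at capture time is recomputed against each replica during refresh cycles. Function names and the sample data are illustrative, not NDLP's actual tooling.

```python
# Sketch: a fixity (integrity) check supporting replication and routine
# refreshing of digital masters. Names are illustrative.
import hashlib

def fixity(data: bytes) -> str:
    """Return a SHA-256 digest, recorded as administrative metadata."""
    return hashlib.sha256(data).hexdigest()

def verify_replicas(recorded_digest, replicas):
    """Return indices of replicas whose content no longer matches."""
    return [i for i, blob in enumerate(replicas)
            if fixity(blob) != recorded_digest]

master = b"TIFF master bit stream"
digest = fixity(master)  # captured once, at conversion time
# Replica 1 has suffered bit corruption and fails the check.
assert verify_replicas(digest, [master, b"TIFF master bit str3am"]) == [1]
```

A failed check triggers re-replication from a good copy; this is the largely automatable part of preservation, in contrast to migration, which <P3> notes requires human judgment.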
Conclusions
RQ "Developing long-term strategies for preserving digital resources presents challenges associated with the uncertainties of technological change. There is currently little experience on which to base predictions of how often migration to new formats will be necessary or desirable or whether emulation will prove cost-effective for certain categories of resources. ... Technological advances, while sure to present new challenges, will also provide new solutions for preserving digital content."
Type
Electronic Journal
Title
A Spectrum of Interoperability: The Site for Science Prototype for the NSDL
"Currently, NSF is funding 64 projects, each making its own contribution to the library, with a total annual budget of about $24 million. Many projects are building collections; others are developing services; a few are carrying out targeted research. The NSDL is a broad program to build a digital library for education in science, mathematics, engineering and technology. It is funded by the National Science Foundation (NSF) Division of Undergraduate Education. . . . The Core Integration task is to ensure that the NSDL is a single coherent library, not simply a set of unrelated activities. In summer 2000, the NSF funded six Core Integration demonstration projects, each lasting a year. One of these grants was to Cornell University and our demonstration is known as Site for Science. It is at http://www.siteforscience.org/ [Site for Science]. In late 2001, the NSF consolidated the Core Integration funding into a single grant for the production release of the NSDL. This grant was made to a collaboration of the University Corporation for Atmospheric Research (UCAR), Columbia University and Cornell University. The technical approach being followed is based heavily on our experience with Site for Science. Therefore this article is both a description of the strategy for interoperability that was developed for Site for Science and an introduction to the architecture being used by the NSDL production team."
ISBN
1082-9873
Critical Arguements
CA "[T]his article is both a description of the strategy for interoperability that was developed for the [Cornell University's NSF-funded] Site for Science and an introduction to the architecture being used by the NSDL production team."
Phrases
<P1> The grand vision is that the NSDL become a comprehensive library of every digital resource that could conceivably be of value to any aspect of education in any branch of science and engineering, both defined very broadly. <P2> Interoperability among heterogeneous collections is a central theme of the Core Integration. The potential collections have a wide variety of data types, metadata standards, protocols, authentication schemes, and business models. <P3> The goal of interoperability is to build coherent services for users, from components that are technically different and managed by different organizations. This requires agreements to cooperate at three levels: technical, content and organizational. <P4> Much of the research of the authors of this paper aims at . . . looking for approaches to interoperability that have low cost of adoption, yet provide substantial functionality. One of these approaches is the metadata harvesting protocol of the Open Archives Initiative (OAI) . . . <P5> For Site for Science, we identified three levels of digital library interoperability: Federation; Harvesting; Gathering. In this list, the top level provides the strongest form of interoperability, but places the greatest burden on participants. The bottom level requires essentially no effort by the participants, but provides a poorer level of interoperability. The Site for Science demonstration concentrated on the harvesting and gathering, because other projects were exploring federation. <P6> In an ideal world all the collections and services that the NSDL wishes to encompass would support an agreed set of standard metadata. The real world is less simple. . . . However, the NSDL does have influence. We can attempt to persuade collections to move along the interoperability curve. <warrant> <P7> The Site for Science metadata strategy is based on two principles. The first is that metadata is too expensive for the Core Integration team to create much of it. 
Hence, the NSDL has to rely on existing metadata or metadata that can be generated automatically. The second is to make use of as much of the metadata available from collections as possible, knowing that it varies greatly from none to extensive. Based on these principles, Site for Science, and subsequently the entire NSDL, developed the following metadata strategy: Support eight standard formats; Collect all existing metadata in these formats; Provide crosswalks to Dublin Core; Assemble all metadata in a central metadata repository; Expose all metadata records in the repository for service providers to harvest; Concentrate limited human effort on collection-level metadata; Use automatic generation to augment item-level metadata. <P8> The strategy developed by Site for Science and now adopted by the NSDL is to accumulate metadata in the native formats provided by the collections . . . If a collection supports the protocols of the Open Archives Initiative, it must be able to supply unqualified Dublin Core (which is required by the OAI) as well as the native metadata format. <P9> From a computing viewpoint, the metadata repository is the key component of the Site for Science system. The repository can be thought of as a modern variant of the traditional library union catalog, a catalog that holds comprehensive catalog records from a group of libraries. . . . Metadata from all the collections is stored in the repository and made available to providers of NSDL service.
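The "crosswalks to Dublin Core" step of the strategy in <P7> can be sketched as a field-mapping table applied to a native record, with unmapped fields dropped rather than guessed at, in keeping with the principle that item-level metadata is too expensive to create by hand. The native field names below are hypothetical.

```python
# Sketch: a crosswalk from a collection's native metadata to unqualified
# Dublin Core, the pivot format of the Site for Science repository.
# The native field names are invented for illustration.

CROSSWALK = {
    "lesson_title": "title",
    "author_name": "creator",
    "topic": "subject",
    "posted_on": "date",
}

def to_dublin_core(native_record: dict) -> dict:
    """Map native fields to DC elements; fields with no mapping are dropped."""
    return {CROSSWALK[k]: v for k, v in native_record.items() if k in CROSSWALK}

record = {"lesson_title": "Plate Tectonics",
          "author_name": "A. Smith",
          "internal_id": 17}
assert to_dublin_core(record) == {"title": "Plate Tectonics",
                                  "creator": "A. Smith"}
```

Storing both the native record and its Dublin Core derivative, as <P8> describes, preserves the richer metadata for services that can use it while guaranteeing a lowest common denominator for all.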
Conclusions
RQ 1 "Can a small team of librarians manage the collection development and metadata strategies for a very large library?" RQ 2 "Can the NSDL actually build services that are significantly more useful than the general web search services?"
Type
Electronic Journal
Title
Electronic Records Research: Working Meeting May 28-30, 1997
CA Archivists are specifically concerned with records that are not easy to document -- records full of secret, proprietary or sensitive information, not to mention hardware and software dependencies. This front end of record-making and record-keeping must be addressed as we define what electronic records are and are not, and how we are to deal with them.
Phrases
<P1> Driven by pragmatism, the University of Pittsburgh team looked for "warrant" in the sources considered authoritative by the practitioners of ancillary professions on whom archivists rely -- lawyers, auditors, IT personnel, etc. (p.3) <P2> If the record creating event and the requirements of 'recordness' are both known, focus shifts to capturing the metadata and binding it to the record contents. (p.7) <P3> A strong business case is still needed to justify the role of archivists in the creation of electronic record management systems. (p.10)
Conclusions
RQ Warrant needs to be looked at in different countries. Does the same core definition of what constitutes a record cut across state borders? What role do specific user needs play in regulatory compliance and risk management?
CA Through OAI, access to resources is effected in a low-cost, interoperable manner.
Phrases
<P1> The need for a metadata format that would support both metadata creation by authors and interoperability across heterogeneous repositories led to the choice of unqualified Dublin Core. (p.16) <P2> OAI develops and promotes a low-barrier interoperability framework and associated standards, originally to enhance access to e-print archives, but now taking into account access to other digital materials. (p.16)
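The "low-barrier interoperability framework" of <P2> rests on a small set of HTTP requests; a harvester asks a repository for records in unqualified Dublin Core, the one format the OAI protocol obliges every repository to support (<P1>). The sketch below builds such a request URL; the base URL is a placeholder, but the `verb`, `metadataPrefix`, and `set` parameters and the `oai_dc` prefix are part of the OAI-PMH protocol itself.

```python
# Sketch: constructing an OAI-PMH ListRecords request for unqualified
# Dublin Core records. The repository base URL is a placeholder.
from typing import Optional
from urllib.parse import urlencode

def list_records_url(base_url: str,
                     metadata_prefix: str = "oai_dc",
                     set_spec: Optional[str] = None) -> str:
    """Build a ListRecords request, optionally scoped to one set."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if set_spec:
        params["set"] = set_spec
    return f"{base_url}?{urlencode(params)}"

url = list_records_url("https://repository.example.org/oai")
assert url == ("https://repository.example.org/oai"
               "?verb=ListRecords&metadataPrefix=oai_dc")
```

The harvester then fetches this URL on a schedule and ingests the returned XML; because the burden on the data provider is just answering these few request types, the barrier to participation stays low.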
Conclusions
RQ The many players involved in cultural heritage need to work together to define standards and best practices.
Type
Electronic Journal
Title
Metadata Corner: Working Meeting on Electronic Records Research
CA Just as the digital library has forced librarians to rethink their profession, so has the electronic record done the same for archivists and recordkeepers. Much of the debate centers around what e-records are and how to deal with them.
Phrases
<P1> Their presentations at the Working Meeting elaborated on the concept of "literary warrant," which can be defined as the mandate from outside the archives profession -- from law, professional best practices and other special sources -- which requires the creation and maintenance of records. (p.1) <P2> Systems that records professionals devise to maintain these concepts over time, may, therefore, be of interest to information professionals interested in digital preservation. Authenticity, indeed, is a key component of Peter Graham's description of "intellectual preservation." (p.3)
Conclusions
RQ How can metadata be linked to its record over time? How can we ensure the "least-loss" migration of metadata over time? Collaboration with warrant creators from other fields such as lawyers and auditors is desirable.
CA Metadata is a key part of the information infrastructure necessary to organize and classify the massive amount of information on the Web. Metadata, just like the resources they describe, will range in quality and be organized around different principles. Modularity is critical to allow metadata schema designers to base their new creations on established schemas, thereby benefiting from best practices rather than reinventing elements each time. Extensibility and cost-effectiveness are also important factors. Controlled vocabularies provide greater precision and access. Multilingualism (translating specification documents into many languages) is an important step in fostering global metadata architecture(s).
Phrases
<P1> The use of controlled vocabularies is another important approach to refinement that improves the precision for descriptions and leverages the substantial intellectual investment made by many domains to improve subject access. (p.4) <P2> Standards typically deal with these issues through the complementary processes of internationalization and localization: the former process relates to the creation of "neutral" standards, whereas the latter refers to the adaptation of such a neutral standard to a local context. (p.4)
Conclusions
RQ In order for the full potential of resource discovery that the Web could offer to be realized, a "convergence" of standards and semantics must occur.
Type
Electronic Journal
Title
Directive 1999/93/EC of the European Parliament and of the Council of 13 December 1999 on a Community Framework for Electronic Signatures
CA "[A] clear Community framework regarding the conditions applying to electronic signatures will strengthen confidence in, and general acceptance of, the new technologies; legislation in the Member States should not hinder the free movement of goods and services in the internal market. ... The interoperability of electronic-signature products should be promoted. ... Rapid technological development and the global character of the Internet necessitate an approach which is open to various technologies and services capable of authenticating data electronically. ... This Directive contributes to the use and legal recognition of electronic signatures within the Community; a regulatory framework is not needed for electronic signatures exclusively used within systems, which are based on voluntary agreements under private law between a specified number of participants; the freedom of parties to agree among themselves the terms and conditions under which they accept electronically signed data should be respected to the extent allowed by national law; the legal effectiveness of electronic signatures used in such systems and their admissibility as evidence in legal proceedings should be recognised. ... The storage and copying of signature-creation data could cause a threat to the legal validity of electronic signatures. ... 
Harmonised criteria relating to the legal effects of electronic signatures will preserve a coherent legal framework across the Community; national law lays down different requirements for the legal validity of hand-written signatures; whereas certificates can be used to confirm the identity of a person signing electronically; advanced electronic signatures based on qualified certificates aim at a higher level of security; advanced electronic signatures which are based on a qualified certificate and which are created by a secure-signature-creation device can be regarded as legally equivalent to hand-written signatures only if the requirements for hand-written signatures are fulfilled. ... In order to contribute to the general acceptance of electronic authentication methods it has to be ensured that electronic signatures can be used as evidence in legal proceedings in all Member States; the legal recognition of electronic signatures should be based upon objective criteria and not be linked to authorisation of the certification-service-provider involved; national law governs the legal spheres in which electronic documents and electronic signatures may be used; this Directive is without prejudice to the power of a national court to make a ruling regarding conformity with the requirements of this Directive and does not affect national rules regarding the unfettered judicial consideration of evidence. ... In order to increase user confidence in electronic communication and electronic commerce, certification-service-providers must observe data protection legislation and individual privacy. ... Provisions on the use of pseudonyms in certificates should not prevent Member States from requiring identification of persons pursuant to Community or national law."
Phrases
<P1> Legal effects of electronic signatures: (1) Member States shall ensure that advanced electronic signatures which are based on a qualified certificate and which are created by a secure-signature-creation device: (a) satisfy the legal requirements of a signature in relation to data in electronic form in the same manner as a handwritten signature satisfies those requirements in relation to paper-based data; and (b) are admissible as evidence in legal proceedings.(2) Member States shall ensure that an electronic signature is not denied legal effectiveness and admissibility as evidence in legal proceedings solely on the grounds that it is: in electronic form, or not based upon a qualified certificate, or not based upon a qualified certificate issued by an accredited certification-service-provider, or not created by a secure signature-creation device. (Art. 5) <P2> Member States shall ensure that a certification-service-provider which issues certificates to the public may collect personal data only directly from the data subject, or after the explicit consent of the data subject, and only insofar as it is necessary for the purposes of issuing and maintaining the certificate. The data may not be collected or processed for any other purposes without the explicit consent of the data subject. (Art. 
8) <P3> Requirements for qualified certificates: Qualified certificates must contain:(a) an indication that the certificate is issued as a qualified certificate; (b) the identification of the certification-service-provider and the State in which it is established; (c) the name of the signatory or a pseudonym, which shall be identified as such; (d) provision for a specific attribute of the signatory to be included if relevant, depending on the purpose for which the certificate is intended; (e) signature-verification data which correspond to signature-creation data under the control of the signatory; (f) an indication of the beginning and end of the period of validity of the certificate; (g) the identity code of the certificate; (h) the advanced electronic signature of the certification-service-provider issuing it; (i) limitations on the scope of use of the certificate, if applicable; and (j) limits on the value of transactions for which the certificate can be used, if applicable. (Annex I) <P4> Requirements for certification-service-providers issuing qualified certificates: Certification-service-providers must: (a) demonstrate the reliability necessary for providing certification services; (b) ensure the operation of a prompt and secure directory and a secure and immediate revocation service; (c) ensure that the date and time when a certificate is issued or revoked can be determined precisely; (d) verify, by appropriate means in accordance with national law, the identity and, if applicable, any specific attributes of the person to which a qualified certificate is issued; (e) employ personnel who possess the expert knowledge, experience, and qualifications necessary for the services provided, in particular competence at managerial level, expertise in electronic signature technology and familiarity with proper security procedures; they must also apply administrative and management procedures which are adequate and correspond to recognised standards; (f) use 
trustworthy systems and products which are protected against modification and ensure the technical and cryptographic security of the process supported by them; (g) take measures against forgery of certificates, and, in cases where the certification-service-provider generates signature-creation data, guarantee confidentiality during the process of generating such data; (h) maintain sufficient financial resources to operate in conformity with the requirements laid down in the Directive, in particular to bear the risk of liability for damages, for example, by obtaining appropriate insurance; (i) record all relevant information concerning a qualified certificate for an appropriate period of time, in particular for the purpose of providing evidence of certification for the purposes of legal proceedings. Such recording may be done electronically; (j) not store or copy signature-creation data of the person to whom the certification-service-provider provided key management services; (k) before entering into a contractual relationship with a person seeking a certificate to support his electronic signature inform that person by a durable means of communication of the precise terms and conditions regarding the use of the certificate, including any limitations on its use, the existence of a voluntary accreditation scheme and procedures for complaints and dispute settlement. Such information, which may be transmitted electronically, must be in writing and in readily understandable language. 
Relevant parts of this information must also be made available on request to third-parties relying on the certificate; (l) use trustworthy systems to store certificates in a verifiable form so that: only authorised persons can make entries and changes, information can be checked for authenticity, certificates are publicly available for retrieval in only those cases for which the certificate-holder's consent has been obtained, and any technical changes compromising these security requirements are apparent to the operator. (Annex II) <P5> Requirements for secure signature-creation devices: 1. Secure signature-creation devices must, by appropriate technical and procedural means, ensure at the least that: (a) the signature-creation-data used for signature generation can practically occur only once, and that their secrecy is reasonably assured; (b) the signature-creation-data used for signature generation cannot, with reasonable assurance, be derived and the signature is protected against forgery using currently available technology; (c) the signature-creation-data used for signature generation can be reliably protected by the legitimate signatory against the use of others. (2) Secure signature-creation devices must not alter the data to be signed or prevent such data from being presented to the signatory prior to the signature process. 
(Annex III) <P6> Recommendations for secure signature verification: During the signature-verification process it should be ensured with reasonable certainty that: (a) the data used for verifying the signature correspond to the data displayed to the verifier; (b) the signature is reliably verified and the result of that verification is correctly displayed; (c) the verifier can, as necessary, reliably establish the contents of the signed data; (d) the authenticity and validity of the certificate required at the time of signature verification are reliably verified; (e) the result of verification and the signatory's identity are correctly displayed; (f) the use of a pseudonym is clearly indicated; and (g) any security-relevant changes can be detected. (Annex IV)
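The create/verify flow that Annexes III and IV describe can be illustrated with a deliberately simplified stand-in. Note the hedge: the sketch uses a symmetric HMAC over a shared secret, whereas the Directive's advanced electronic signatures rest on asymmetric signature-creation data (a private key held only by the signatory) and a qualified certificate binding the corresponding verification data to the signatory's identity. Only the shape of the flow carries over.

```python
# Sketch (simplified stand-in): the sign/verify flow of the Directive's
# Annexes, using a symmetric HMAC. Real advanced electronic signatures
# use asymmetric keys and qualified certificates; this only illustrates
# the binding between signature-creation and signature-verification data.
import hashlib
import hmac

signature_creation_data = b"held only by the signatory"  # Annex III(c): secrecy

def sign(document: bytes) -> str:
    return hmac.new(signature_creation_data, document, hashlib.sha256).hexdigest()

def verify(document: bytes, signature: str) -> bool:
    # Annex IV(a): the data verified must correspond to the data signed.
    return hmac.compare_digest(sign(document), signature)

doc = b"contract text"
sig = sign(doc)
assert verify(doc, sig)
assert not verify(b"altered contract text", sig)  # tampering is detected
```

The second assertion is the legally significant property: any change to the signed data invalidates the signature, which is what lets Article 5 treat such signatures as evidence.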
Type
Electronic Journal
Title
The National Digital Information Infrastructure Preservation Program: Expectations, Realities, Choices and Progress to Date
CA The goals of this plan include the continued collecting of materials regardless of evolving digital formats, the long-term preservation of said materials and ensuring access to them for the American people.
Phrases
<P1> There is widespread support for a national initiative in long term preservation of digital content across a very broad range of stakeholder groups outside the traditional scholarly community (p.4) <warrant> <P2> Approaching persistent archiving from the perspective of infrastructure allows system designers to decouple the data storage from the various components that allow users to manage the data. Potentially, any component can be "swapped out" without affecting the rest of the system. Theoretically, many of the technical problems in archiving can be separated into their components, and as innovation occurs, those components can be updated so that the archive remains persistent in the context of rapid change. Similarly, as the storage media obsolesce, the data can be migrated without affecting the overall integrity of the system. (p.7)
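The "swappable component" architecture described in P2 can be sketched as a minimal storage abstraction. The class and method names below are illustrative assumptions, not part of the NDIIPP design:

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """Any storage medium the archive can write to."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...
    @abstractmethod
    def keys(self) -> list: ...

class InMemoryBackend(StorageBackend):
    """Stand-in for any concrete medium (tape, disk, cloud)."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]
    def keys(self):
        return list(self._blobs)

class Archive:
    """Data management is decoupled from data storage, so a
    backend can be swapped out without affecting the rest."""
    def __init__(self, backend: StorageBackend):
        self.backend = backend
    def migrate_to(self, new_backend: StorageBackend):
        # As the storage media obsolesce, the data can be migrated
        # without affecting the overall integrity of the system.
        for key in self.backend.keys():
            new_backend.put(key, self.backend.get(key))
        self.backend = new_backend
```

Because callers talk only to the `StorageBackend` interface, the migration step is the only place that touches both old and new media.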
Conclusions
RQ Scenario-planning exercises may help expose assumptions that could ultimately be limiting in the future.
Type
Electronic Journal
Title
Coming to TERM: Designing the Texas Repository E-mail
CA The sheer volume of e-mail makes it difficult to preserve. Because of recent problems in the management of government e-mail, the Texas Department of Information Resources has been exploring ways to centralize and store all Texas state government e-mail.
Phrases
<P1> The Open Archival Information System (OAIS) Reference Model has attracted wide attention as a workable model because it provides the elements that research indicates are necessary: a closely audited, well documented, and constantly maintained and updated system. These elements are especially attractive to government. This model has the advantage of being an ISO international standard. <P2> This preservation strategy respects the archival bond by providing evidence of the official business transactions of the state in an architecture that respects and maintains links and ties between the compound parts of the record as well as larger logical groupings of records. <P3> Hundreds of archival institutions exist in North America, and yet very few, public or private, are currently preserving electronic records for the long-term.
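As a rough illustration of the OAIS packaging idea invoked in P1 and P2, the sketch below models an Archival Information Package grouping content information with preservation description information (provenance, context, fixity, reference). The field contents are hypothetical; a real TERM implementation would be far richer:

```python
from dataclasses import dataclass, field

@dataclass
class ContentInformation:
    data: bytes               # the e-mail message itself
    representation_info: str  # e.g. "RFC 2822 plain text"

@dataclass
class PreservationDescription:
    provenance: str  # who created and transmitted the record
    context: str     # the business transaction it documents
    fixity: str      # e.g. a checksum over the content
    reference: str   # a persistent identifier

@dataclass
class ArchivalInformationPackage:
    content: ContentInformation
    pdi: PreservationDescription
    # Links to related records preserve the archival bond between
    # the compound parts of a record and larger logical groupings.
    related_records: list = field(default_factory=list)
```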
Conclusions
RQ How does one construct and manage a large repository of electronic data?
The Semantic Web activity is a W3C project whose goal is to enable a 'cooperative' Web where machines and humans can exchange electronic content that has clear-cut, unambiguous meaning. This vision is based on the automated sharing of metadata terms across Web applications. The declaration of schemas in metadata registries advances this vision by providing a common approach for the discovery, understanding, and exchange of semantics. However, many of the issues regarding registries are not clear, and ideas vary regarding their scope and purpose. Additionally, registry issues are often difficult to describe and comprehend without a working example.
ISBN
1082-9873
Critical Arguements
CA "This article will explore the role of metadata registries and will describe three prototypes, written by the Dublin Core Metadata Initiative. The article will outline how the prototypes are being used to demonstrate and evaluate application scope, functional requirements, and technology solutions for metadata registries."
Phrases
<P1> Establishing a common approach for the exchange and re-use of data across the Web would be a major step towards achieving the vision of the Semantic Web. <warrant> <P2> The Semantic Web Activity statement articulates this vision as: 'having data on the Web defined and linked in a way that it can be used for more effective discovery, automation, integration, and reuse across various applications. The Web can reach its full potential if it becomes a place where data can be shared and processed by automated tools as well as by people.' <P3> In parallel with the growth of content on the Web, there have been increases in the amount and variety of metadata to manipulate this content. An inordinate amount of standards-making activity focuses on metadata schemas (also referred to as vocabularies or data element sets), and yet significant differences in schemas remain. <P4> Different domains typically require differentiation in the complexity and semantics of the schemas they use. Indeed, individual implementations often specify local usage, thereby introducing local terms to metadata schemas specified by standards-making bodies. Such differentiation undermines interoperability between systems. <P5> This situation highlights a growing need for access by users to in-depth information about metadata schemas and particular extensions or variations to schemas. Currently, these 'users' are human: people requesting information. <warrant> <P6> It would be helpful to make available easy access to schemas already in use to provide both humans and software with comprehensive, accurate and authoritative information. <warrant> <P7> The W3C Resource Description Framework (RDF) has provided the basis for a common approach to declaring schemas in use. At present the RDF Schema (RDFS) specification offers the basis for a simple declaration of schema. <P8> Even as it stands, an increasing number of initiatives are using RDFS to 'publish' their schemas.
<P9> Registries provide 'added value' to users by indexing schemas relevant to a particular 'domain' or 'community of use' and by simplifying the navigation of terms by enabling multiple schemas to be accessed from one view. <warrant> <P10> Additionally, the establishment of registries to index terms actively being used in local implementations facilitates the metadata standards activity by providing implementation experience transferable to the standards-making process. <warrant> <P11> The overriding goal has been the development of a generic registry tool useful for registry applications in general, not just useful for the DCMI. <P12> The formulation of a 'definitive' set of RDF schemas within the DCMI that can serve as the recommended, comprehensive and accurate expression of the DCMI vocabulary has hindered the development of the DCMI registry. To some extent, this has been due to the changing nature of the RDF Schema specification and its W3C candidate recommendation status. However, it should be recognized that the lack of consensus within the DCMI community regarding the RDF schemas has proven to be equally as impeding. <P13> The automated sharing of metadata across applications is an important part of realizing the goal of the Semantic Web. Users and applications need practical solutions for discovering and sharing semantics. Schema registries provide a viable means of achieving this. <warrant>
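The registry behaviour described in P9 (terms from multiple schemas accessed from one view) can be sketched with a toy index. The schema names and term definitions are invented for illustration and do not reflect the DCMI registry's actual data model:

```python
class MetadataRegistry:
    """Indexes terms from multiple metadata schemas so that
    they can be navigated from a single view."""
    def __init__(self):
        self._terms = {}  # (schema, term) -> definition

    def register(self, schema: str, term: str, definition: str):
        self._terms[(schema, term)] = definition

    def lookup(self, term: str) -> dict:
        # Every schema defining this term, in one view.
        return {s: d for (s, t), d in self._terms.items() if t == term}

registry = MetadataRegistry()
registry.register("dc", "title", "A name given to the resource.")
registry.register("marc", "title", "Title statement (field 245).")
```

A call such as `registry.lookup("title")` surfaces both usages side by side, which is the cross-schema navigation the prototypes aim to demonstrate.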
Conclusions
RQ "Many of the issues regarding metadata registries are unclear and ideas regarding their scope and purpose vary. Additionally, registry issues are often difficult to describe and comprehend without a working example. The DCMI makes use of rapid prototyping to help solve these problems. Prototyping is a process of quickly developing sample applications that can then be used to demonstrate and evaluate functionality and technology."
SOW
DC "New impetus for the development of registries has come with the development activities surrounding creation of the Semantic Web. The motivation for establishing registries arises from domain and standardization communities, and from the knowledge management community." ... "The original charter for the DCMI Registry Working Group was to establish a metadata registry to support the activity of the DCMI. The aim was to enable the registration, discovery, and navigation of semantics defined by the DCMI, in order to provide an authoritative source of information regarding the DCMI vocabulary. Emphasis was placed on promoting the use of the Dublin Core and supporting the management of change and evolution of the DCMI vocabulary." ... "Discussions within the DCMI Registry Working Group (held primarily on the group's mailing list) have produced draft documents regarding application scope and functionality. These discussions and draft documents have been the basis for the development of registry prototypes and continue to play a central role in the iterative process of prototyping and feedback." ... The overall goal of the DCMI Registry Working Group (WG) is to provide a focus for continued development of the DCMI Metadata Registry. The WG will provide a forum for discussing registry-related activities and facilitating cooperation with the ISO 11179 community, the Semantic Web, and other related initiatives on issues of common interest and relevance.
Type
Electronic Journal
Title
Primary Sources, Research, and the Internet: The Digital Scriptorium at Duke
First Monday, Peer Reviewed Journal on the Internet
Publication Year
1997
Volume
2
Issue
9
Critical Arguements
CA "As the digital revolution moves us ever closer to the idea of the 'virtual library,' repositories of primary sources and other archival materials have both a special opportunity and responsibility. Since the materials in their custody are, by definition, often unique, these institutions will need to work very carefully with scholars and other researchers to determine what is the most effective way of making this material accessible in a digital environment."
Phrases
<P1> The matter of Internet access to research materials and collections is not one of simply doing what we have always done -- except digitally. It represents instead an opportunity to rethink the fundamental triangular relationship between libraries and archives, their collections, and their users. <P2> Digital information as it exists on the Internet today requires more navigational, contextual, and descriptive data than is currently provided in traditional card catalogs or their more modern electronic equivalent. One simply cannot throw up vast amounts of textual or image-based data onto the World Wide Web and expect existing search engines to make much sense of it or users to be able to digest the results. ... Archivists and manuscript curators have for many years now been providing just that sort of contextual detail in the guides, finding aids, and indexes that they have traditionally prepared for their holdings. <P3> Those involved in the Berkeley project understood that HTML was essentially a presentational encoding scheme and lacked the formal structural and content-based encoding that SGML would offer. <P4> Encoded Archival Description is quickly moving towards becoming an internationally embraced standard for the encoding of archival metadata in a wide variety of archival repositories and special collections libraries. And the Digital Scriptorium at Duke has become one of the early implementors of this standard. <warrant>
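To show why structural (rather than presentational) encoding matters, as argued in P3 and P4, here is a drastically simplified, hypothetical fragment in the spirit of an EAD finding aid, addressed programmatically; real EAD instances carry far more structure and validate against the EAD DTD or schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily abridged finding-aid fragment.
fragment = """
<ead>
  <archdesc level="collection">
    <did><unittitle>Jane Doe Papers</unittitle></did>
    <dsc>
      <c01 level="series">
        <did><unittitle>Correspondence, 1890-1920</unittitle></did>
      </c01>
    </dsc>
  </archdesc>
</ead>
"""

root = ET.fromstring(fragment)
# Structural encoding lets software address description at a
# granular level (collection, series, ...), which purely
# presentational HTML markup cannot support.
titles = [t.text for t in root.iter("unittitle")]
```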
Conclusions
RQ "Duke is currently involved in a project that is funded through NEH and also involves the libraries of Stanford, the University of Virginia, and the University of California-Berkeley. This project (dubbed the "American Heritage Virtual Digital Archives Project") will create a virtual archive of encoded finding aids from all four institutions. This archive will permit seamless searching of these finding aids -- at a highly granular level of detail -- through a single search engine on one site and will, it is hoped, provide a model for a more comprehensive national system in the near future."
Type
Electronic Journal
Title
Review: Some Comments on Preservation Metadata and the OAIS Model
CA Criticizes some of the limitations of OAIS and makes suggestions for improvements and clarifications. Also suggests that OAIS may be too library-centric, to the detriment of archival and especially recordkeeping needs. "In this article I have tried to articulate some of the main requirements for the records and archival community in preserving (archival) records. Based on this, the conclusion has to be that some adaptations to the [OAIS] model and metadata set would be necessary to meet these requirements. This concerns requirements such as the concept of authenticity of records, information on the business context of records and on relationships between records ('documentary context')."(p. 20)
Phrases
<P1> It requires records managers and archivists (and perhaps other information professionals) to be aware of these differences [in terminology] and to make a translation of such terms to their own domain. (p. 15) <P2> When applying the metadata model for a wider audience, more awareness of the issue of terminology is required, for instance by including clear definitions of key terms. (p. 15) <P3> The extent to which the management of objects can be influenced differs with respect to the type of objects. In the case of (government) records, legislation governs their creation and management, whereas, in the case of publications, the influence will be mostly based on agreements between producers, publishers and preservers. (p. 16) <P4> [A]lthough the suggestion may sometimes be otherwise, preservation metadata do not only apply to what is under the custody of a cultural or other preserving institution, but should be applied to the whole lifecycle of digital objects. ... Preservation can be viewed as part of maintenance. <warrant> (p. 16) <P5> [B]y taking library community needs as leading (albeit implicitly), the approach is already restricting the types of digital objects. Managing different types of 'digital objects', e.g. publications and records, may require not entirely similar sets of metadata. (p. 16) <P6> Another issue is that of the requirements governing the preservation processes. ... There needs to be insight and, as a consequence, also metadata about the preservation strategies, policies and methods, together with the context in which the preservation takes place. <warrant> (p. 16) <P7> [W]hat do we want to preserve? Is it the intellectual content with the functionality it has to have in order to make sense and achieve its purpose, or is it the digital components that are necessary to reproduce it or both? (p. 
16-17) <P8> My view is that 'digital objects' should be seen as objects having both conceptual and technical aspects that are closely interrelated. As a consequence of the explanation given above, a digital object may consist of more than one 'digital component'. The definition given in the OAIS model is therefore insufficient. (p. 17) <P9> [W]e have no fewer than five metadata elements that could contain information on what should be rendered and presented on the screen. How all these elements relate to each other, if at all, is unclear. (p. 17) <P10> What we want to achieve ... is that in the future we will still be able to see, read and understand the documents or other information entities that were once produced for a certain purpose and in a certain context. In trying to achieve this, we of course need to preserve these digital components, but, as information technology will evolve, these components have to be migrated or in some cases emulated to be usable on future hard- and software platforms. (p. 17) <P11> I would like to suggest including an element that reflects the original technical environment. (p. 18) <P12> Records, according to the recently published ISO records management standard 15489, are 'information created, received and maintained as evidence and information by an organisation or person, in pursuance of legal obligations or in the transaction of business'. ... The main requirements for records to serve as evidence or authoritative information sources are ... authenticity and integrity, and knowledge about the business context and about the interrelationship between records (e.g. in a case file). <warrant> (p. 18) <P13> It would have been helpful if there had been more acknowledgement of the issue of authenticity and the requirements for it, and if the Working Group had provided some background information about its view and considerations on this aspect and to what extent it is included or not. (p. 
19) <P14> In order to be able to preserve (archival) records it will ... be necessary to extend the information model with another class of information that refers to business context. Such a subset could provide a structure for describing what in archival terminology is called information about 'provenance' (with a different meaning from that in OAIS). (p. 19) <P15> In order to accommodate the identified complexity it is necessary to distinguish at least between the following categories of relationships: relationships between intellectual objects ... in the archival context this is referred to as 'documentary context'; relationships between the (structural) components of one intellectual object ... ; [and] relationships between digital components. (p. 19-20) <P16> [T]he issue of appraisal and disposition of records has to be included. In this context the recently published records management standard (ISO 15489) may serve as a useful framework. It would make the OAIS model even more widely applicable. (p. 20)
Conclusions
RQ "There are some issues ... which need further attention. They concern on the one hand the scope and underlying concepts of the OAIS model and the resulting metadata set as presented, and on the other hand the application of the model and metadata set in a records and archival environment. ... [T]he distinction between physical and conceptual or intellectual aspects of a digital object should be made more explicit and will probably have an impact on the model and metadata set also. More attention also needs to be given to the relationship between the (preservation) processes and the metadata. ... In assessing the needs of the records and archival community, the ISO records management standard 15489 may serve as a very useful framework. Such an exercise would also include a test for applicability of the model and metadata set for record-creating organisations and, as such, broaden the view of the OAIS model." (p. 20)
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
Type
Electronic Journal
Title
Computer Records and the Federal Rules of Evidence
See also U.S. Federal Rules of Evidence. Rule 803. Hearsay Exceptions; Availability of Declarant Immaterial.
Publisher
U.S. Department of Justice Executive Office for United States Attorneys
Critical Arguements
CA "This article explains some of the important issues that can arise when the government seeks the admission of computer records under the Federal Rules of Evidence. It is an excerpt of a larger DOJ manual entitled 'Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations,' which is available on the internet at www.cybercrime.gov/searchmanual.htm." Cites cases dealing with Fed. R. Evid. 803(6).
Phrases
<P1>Most federal courts that have evaluated the admissibility of computer records have focused on computer records as potential hearsay. The courts generally have admitted computer records upon a showing that the records fall within the business records exception, Fed. R. Evid. 803(6). <P2> See, e.g., United States v. Cestnik, 36 F.3d 904, 909-10 (10th Cir. 1994); United States v. Moore, 923 F.2d 910, 914 (1st Cir. 1991); United States v. Briscoe, 896 F.2d 1476, 1494 (7th Cir. 1990); United States v. Catabran, 836 F.2d 453, 457 (9th Cir. 1988); Capital Marine Supply v. M/V Roland Thomas II, 719 F.2d 104, 106 (5th Cir. 1983). <P3> Applying this test, the courts have indicated that computer records generally can be admitted as business records if they were kept pursuant to a routine procedure for motives that tend to assure their accuracy. <warrant>
Conclusions
RQ "The federal courts are likely to move away from this 'one size fits all' approach as they become more comfortable and familiar with computer records. Like paper records, computer records are not monolithic: the evidentiary issues raised by their admission should depend on what kind of computer records a proponent seeks to have admitted. For example, computer records that contain text often can be divided into two categories: computer-generated records, and records that are merely computer-stored. See People v. Holowko, 486 N.E.2d 877, 878-79 (Ill. 1985). The difference hinges upon whether a person or a machine created the records' contents. ... As the federal courts develop a more nuanced appreciation of the distinctions to be made between different kinds of computer records, they are likely to see that the admission of computer records generally raises two distinct issues. First, the government must establish the authenticity of all computer records by providing 'evidence sufficient to support a finding that the matter in question is what its proponent claims.' Fed. R. Evid. 901(a). Second, if the computer records are computer-stored records that contain human statements, the government must show that those human statements are not inadmissible hearsay."
CA Describes efforts undertaken at the National Library of New Zealand to ensure preservation of electronic resources.
Phrases
<P1> The National Library Act 1965 provides the legislative framework for the National Library of New Zealand '... to collect, preserve, and make available recorded knowledge, particularly that relating to New Zealand, to supplement and further the work of other libraries in New Zealand, and to enrich the cultural and economic life of New Zealand and its cultural interchanges with other nations.' Legislation currently before Parliament, if enacted, will give the National Library the mandate to collect digital resources for preservation purposes. <warrant> (p. 18) <P2> So, the Library has an organisational commitment and may soon have the legislative environment to support the collection, management and preservation of digital objects. ... The next issue is what needs to be done to ensure that a viable preservation programme can actually be put in place. (p. 18) <P3> As the Library had already begun systematising its approach to resource discovery metadata, development of a preservation metadata schema for use within the Library was a logical next step. (p. 18) <P4> Work on the schema was initially informed by other international endeavours relating to preservation metadata, particularly that undertaken by the National Library of Australia. Initiatives through the CEDARS programme, OCLC/RLG activities and the emerging consensus regarding the role of the OAIS Reference Model ... were also taken into account. <warrant> (p. 18-19) <P5> The Library's Preservation Metadata schema is designed to strike a balance between the principles of preservation metadata, as expressed through the OAIS Information Model, and the practicalities of implementing a working set of preservation metadata. The same incentive informs a recent OCLC/RLG report on the OAIS model. (p. 19) <P6> [I]t is unlikely that anything resembling a comprehensive schema will become available in the short term. However, the need is pressing. (p. 
19) <P7> The development of the preservation metadata schema is one component of an ongoing programme of activities needed to ensure the incorporation of digital material into the Library's core business processes with a view to the long-term accessibility of those resources. <warrant> (p. 19) <P8> The aim of the above activities is for the Library to be acknowledged as a 'trusted repository' for digital material which ensures the viability and authenticity of digital objects over time. (p. 20) <P9> The Library will also have to develop relationships with other organisations that might wish to achieve 'trusted repository' status in a country with a small population base and few agencies of appropriate size, funding and willingness to take on the role.
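One practical element of operating such a schema is checking records for completeness before ingest. The sketch below assumes a hypothetical set of required fields; the Library's actual preservation metadata schema differs:

```python
# Hypothetical field names, not the National Library of New
# Zealand's actual preservation metadata schema.
REQUIRED = {"identifier", "fixity", "format", "provenance", "rights"}

def missing_fields(record: dict) -> set:
    """Return the preservation-metadata fields still absent."""
    return REQUIRED - record.keys()

record = {"identifier": "obj-001",
          "fixity": "sha1:...",
          "format": "image/tiff"}
gaps = missing_fields(record)
```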
Conclusions
RQ There are still a number of important issues to be resolved before the Library's preservation programme can be deemed a success, including the need for: higher level of awareness of the need for digital preservation within the community of 'memory institutions' and more widely; metrics regarding the size and scope of the problem; finance to research and implement digital preservation; new skill sets for implementing digital preservation, e.g. running the multiplicity of hardware/software involved, digital conservation/archaeology; agreed international approaches to digital preservation; practical models to match the high level conceptual work already undertaken internationally; co-operation/collaboration between the wider range of agents potentially able to assist in developing digital preservation solutions, e.g. the computing industry; and, last but not least, clarity around intellectual property, copyright, privacy and moral rights.
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
Type
Electronic Journal
Title
The Warwick Framework: A container architecture for diverse sets of metadata
This paper is an abbreviated version of The Warwick Framework: A Container Architecture for Aggregating Sets of Metadata. It describes a container architecture for aggregating logically, and perhaps physically, distinct packages of metadata. This "Warwick Framework" is the result of the April 1996 Metadata II Workshop in Warwick U.K.
ISBN
1082-9873
Critical Arguements
CA Describes the Warwick Framework, a proposal for linking together the various metadata schemes that may be attached to a given information object by using a system of "packages" and "containers." "[Warwick Workshop] attendees concluded that ... the route to progress on the metadata issue lay in the formulation of a higher-level context for the Dublin Core. This context should define how the Core can be combined with other sets of metadata in a manner that addresses the individual integrity, distinct audiences, and separate realms of responsibility of these distinct metadata sets. The result of the Warwick Workshop is a container architecture, known as the Warwick Framework. The framework is a mechanism for aggregating logically, and perhaps physically, distinct packages of metadata. This is a modularization of the metadata issue with a number of notable characteristics. It allows the designers of individual metadata sets to focus on their specific requirements, without concerns for generalization to ultimately unbounded scope. It allows the syntax of metadata sets to vary in conformance with semantic requirements, community practices, and functional (processing) requirements for the kind of metadata in question. It separates management of and responsibility for specific metadata sets among their respective "communities of expertise." It promotes interoperability by allowing tools and agents to selectively access and manipulate individual packages and ignore others. It permits access to the different metadata sets that are related to the same object to be separately controlled. It flexibly accommodates future metadata sets by not requiring changes to existing sets or the programs that make use of them."
Phrases
<P1> The range of metadata needed to describe and manage objects is likely to continue to expand as we become more sophisticated in the ways in which we characterize and retrieve objects and also more demanding in our requirements to control the use of networked information objects. The architecture must be sufficiently flexible to incorporate new semantics without requiring a rewrite of existing metadata sets. <warrant> <P2> Each logically distinct metadata set may represent the interests of and domain of expertise of a specific community. <P3> Just as there are disparate sources of metadata, different metadata sets are used by and may be restricted to distinct communities of users and agents. <P4> Strictly partitioning the information universe into data and metadata is misleading. <P5> If we allow for the fact that metadata for an object consists of logically distinct and separately administered components, then we should also provide for the distribution of these components among several servers or repositories. The references to distributed components should be via a reliable persistent name scheme, such as that proposed for Universal Resources Names (URNs) and Handles. <P6> [W]e emphasize that the existence of a reliable URN implementation is necessary to avoid the problems of dangling references that plague the Web. <warrant> <P7> Anyone can, in fact, create descriptive data for a networked resource, without permission or knowledge of the owner or manager of that resource. This metadata is fundamentally different from that metadata that the owner of a resource chooses to link or embed with the resource. We, therefore, informally distinguish between two categories of metadata containers, which both have the same implementation [internally referenced and externally referenced metadata containers].
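The container/package architecture, including the internally versus externally referenced (URN-resolved) distinction drawn in P5 through P7, can be sketched as follows. The class names are illustrative, not the Framework's normative terminology:

```python
from dataclasses import dataclass

@dataclass
class Package:
    """An internally held metadata set, in its own syntax."""
    type: str      # e.g. "dublin-core", "rights", "provenance"
    payload: dict

@dataclass
class ExternalPackage:
    """A package held elsewhere, referenced by a persistent name
    so the reference does not dangle when servers move."""
    type: str
    urn: str

class Container:
    """Aggregates logically distinct metadata packages. Agents
    selectively access the package types they understand and
    simply ignore the rest."""
    def __init__(self):
        self.packages = []
    def add(self, pkg):
        self.packages.append(pkg)
    def of_type(self, type_):
        return [p for p in self.packages if p.type == type_]
```

An agent interested only in descriptive metadata would call `of_type("dublin-core")` and never touch, say, a separately controlled rights package.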
Conclusions
RQ "We run the danger, with the full expressiveness of the Warwick Framework, of creating such complexity that the metadata is effectively useless. Finding the appropriate balance is a central design problem. ... Definers of specific metadata sets should ensure that the set of operations and semantics of those operations will be strictly defined for a package of a given type. We expect that a limited set of metadata types will be widely used and 'understood' by browsers and agents. However, the type system must be extensible, and some method that allows existing clients and agents to process new types must be a part of a full implementation of the Framework. ... There is a need to agree on one or more syntaxes for the various metadata sets. Even in the context of the relatively simple World Wide Web, the Internet is often unbearably slow and unreliable. Connections often fail or time out due to high load, server failure, and the like. In a full implementation of the Warwick Framework, access to a "document" might require negotiation across distributed repositories. The performance of this distributed architecture is difficult to predict and is prone to multiple points of failure. ... It is clear that some protocol work will need to be done to support container and package interchange and retrieval. ... Some examination of the relationship between the Warwick Framework and ongoing work in repository architectures would likely be fruitful."
Type
Electronic Journal
Title
Collection-Based Persistent Digital Archives - Part 1
The preservation of digital information for long periods of time is becoming feasible through the integration of archival storage technology from supercomputer centers, data grid technology from the computer science community, information models from the digital library community, and preservation models from the archivist's community. The supercomputer centers provide the technology needed to store the immense amounts of digital data that are being created, while the digital library community provides the mechanisms to define the context needed to interpret the data. The coordination of these technologies with preservation and management policies defines the infrastructure for a collection-based persistent archive. This paper defines an approach for maintaining digital data for hundreds of years through development of an environment that supports migration of collections onto new software systems.
ISSN
1082-9873
Critical Arguments
CA "Supercomputer centers, digital libraries, and archival storage communities have common persistent archival storage requirements. Each of these communities is building software infrastructure to organize and store large collections of data. An emerging common requirement is the ability to maintain data collections for long periods of time. The challenge is to maintain the ability to discover, access, and display digital objects that are stored within an archive, while the technology used to manage the archive evolves. We have implemented an approach based upon the storage of the digital objects that comprise the collection, augmented with the meta-data attributes needed to dynamically recreate the data collection. This approach builds upon the technology needed to support extensible database schema, which in turn enables the creation of data handling systems that interconnect legacy storage systems."
Phrases
<P1> The ultimate goal is to preserve not only the bits associated with the original data, but also the context that permits the data to be interpreted. <warrant> <P2> We rely on the use of collections to define the context to associate with digital data. The context is defined through the creation of semi-structured representations for both the digital objects and the associated data collection. <P3> A collection-based persistent archive is therefore one in which the organization of the collection is archived simultaneously with the digital objects that comprise the collection. <P4> The goal is to preserve digital information for at least 400 years. This paper examines the technical issues that must be addressed and presents a prototype implementation. <P5> Digital object representation. Every digital object has attributes that define its structure, physical context, and provenance, and annotations that describe features of interest within the object. Since the set of attributes (such as annotations) will vary across all objects within a collection, a semi-structured representation is needed. Not all digital objects will have the same set of associated attributes. <P6> If possible, a common information model should be used to reference the attributes associated with the digital objects, the collection organization, and the presentation interface. An emerging standard for a uniform data exchange model is the eXtensible Markup Language (XML). <P7> A particular example of an information model is the XML Document Type Definition (DTD) which provides a description for the allowed nesting structure of XML elements. Richer information models are emerging such as XSchema (which provides data types, inheritance, and more powerful linking mechanisms) and XMI (which provides models for multiple levels of data abstraction).
<P8> Although XML DTDs were originally applied to documents only, they are now being applied to arbitrary digital objects, including the collections themselves. More generally, OSDs can be used to define the structure of digital objects, specify inheritance properties of digital objects, and define the collection organization and user interface structure. <P9> A persistent collection therefore needs the following components of an OSD to completely define the collection context: Data dictionary for collection semantics; Digital object structure; Collection structure; and User interface structure. <P10> The re-creation or instantiation of the data collection is done with a software program that uses the schema descriptions that define the digital object and collection structure to generate the collection. The goal is to build a generic program that works with any schema description. <P11> The information for which driver to use for access to a particular data set is maintained in the associated Meta-data Catalog (MCAT). The MCAT system is a database containing information about each data set that is stored in the data storage systems. <P12> The data handling infrastructure developed at SDSC has two components: the SDSC Storage Resource Broker (SRB) that provides federation and access to distributed and diverse storage resources in a heterogeneous computing environment, and the Meta-data Catalog (MCAT) that holds systemic and application or domain-dependent meta-data about the resources and data sets (and users) that are being brokered by the SRB. <P13> A client does not need to remember the physical mapping of a data set. It is stored as meta-data associated with the data set in the MCAT catalog. <P14> A characterization of a relational database requires a description of both the logical organization of attributes (the schema), and a description of the physical organization of attributes into tables. 
For the persistent archive prototype we used XML DTDs to describe the logical organization. <P15> A combination of the schema and physical organization can be used to define how queries can be decomposed across the multiple tables that are used to hold the meta-data attributes. <P16> By using an XML-based database, it is possible to avoid the need to map between semi-structured and relational organizations of the database attributes. This minimizes the amount of information needed to characterize a collection, and makes the re-creation of the database easier. <warrant> <P17> Digital object attributes are separated into two classes of information within the MCAT: System-level meta-data that provides operational information. These include information about resources (e.g., archival systems, database systems, etc., and their capabilities, protocols, etc.) and data objects (e.g., their formats or types, replication information, location, collection information, etc.); Application-dependent meta-data that provides information specific to particular data sets and their collections (e.g., Dublin Core values for text objects). <P18> Internally, MCAT keeps schema-level meta-data about all of the attributes that are defined. The schema-level attributes are used to define the context for a collection and enable the instantiation of the collection on new technology. <P19> The logical structure should not be confused with database schema and are more general than that. For example, we have implemented the Dublin Core database schema to organize attributes about digitized text. The attributes defined in the logical structure that is associated with the Dublin Core schema contains information about the subject, constraints, and presentation formats that are needed to display the schema along with information about its use and ownership. 
<P20> The MCAT system supports the publication of schemata associated with data collections, schema extension through the addition or deletion of new attributes, and the dynamic generation of the SQL that corresponds to joins across combinations of attributes. <P21> By adding routines to access the schema-level meta-data from an archive, it is possible to build a collection-based persistent archive. As technology evolves and the software infrastructure is replaced, the MCAT system can support the migration of the collection to the new technology.
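The schema-driven re-creation described in P10 and P14 can be sketched in miniature: a generic program reads a logical organization (standing in for a parsed XML DTD) and generates the relational DDL needed to instantiate the collection on a new database. The collection name, attribute names, and SQL types below are hypothetical illustrations, not the actual MCAT or SDSC formats.

```python
# Sketch of P10/P14: re-creating a collection's relational organization
# from an infrastructure-independent schema description. All names and
# types here are invented for illustration.

def ddl_from_schema(collection, elements):
    """Map a logical organization (attribute name -> SQL type) onto a
    relational table definition (the schema-to-DDL "loading" mapping)."""
    cols = ",\n  ".join(f"{name} {sqltype}" for name, sqltype in elements.items())
    return f"CREATE TABLE {collection} (\n  {cols}\n);"

# A toy logical organization, standing in for a parsed XML DTD.
dublin_core_subset = {
    "identifier": "VARCHAR(64) PRIMARY KEY",
    "title": "VARCHAR(256)",
    "creator": "VARCHAR(128)",
    "date_created": "DATE",
}

print(ddl_from_schema("email_collection", dublin_core_subset))
```

Because the schema description, not the generated SQL, is what gets archived, the same description can be replayed against whatever database technology exists when the collection is next instantiated.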
Conclusions
RQ Collection-Based Persistent Digital Archives - Part 2
SOW
DC "The technology proposed by SDSC for implementing persistent archives builds upon interactions with many of these groups. Explicit interactions include collaborations with Federal planning groups, the Computational Grid, the digital library community, and individual federal agencies." ... "The data management technology has been developed through multiple federally sponsored projects, including the DARPA project F19628-95-C-0194 "Massive Data Analysis Systems," the DARPA/USPTO project F19628-96-C-0020 "Distributed Object Computation Testbed," the Data Intensive Computing thrust area of the NSF project ASC 96-19020 "National Partnership for Advanced Computational Infrastructure," the NASA Information Power Grid project, and the DOE ASCI/ASAP project "Data Visualization Corridor." Additional projects related to the NSF Digital Library Initiative Phase II and the California Digital Library at the University of California will also support the development of information management technology. This work was supported by a NARA extension to the DARPA/USPTO Distributed Object Computation Testbed, project F19628-96-C-0020."
Type
Electronic Journal
Title
Collection-Based Persistent Digital Archives - Part 2
"Collection-Based Persistent Digital Archives: Part 2" describes the creation of a one million message persistent E-mail collection. It discusses the four major components of a persistent archive system: support for ingestion, archival storage, information discovery, and presentation of the collection. The technology to support each of these processes is still rapidly evolving, and opportunities for further research are identified.
ISSN
1082-9873
Critical Arguments
CA "The multiple migration steps can be broadly classified into a definition phase and a loading phase. The definition phase is infrastructure independent, whereas the loading phase is geared towards materializing the processes needed for migrating the objects onto new technology. We illustrate these steps by providing a detailed description of the actual process used to ingest and load a million-record E-mail collection at the San Diego Supercomputer Center (SDSC). Note that the SDSC processes were written to use the available object-relational databases for organizing the meta-data. In the future, it may be possible to go directly to XML-based databases."
Phrases
<P1> The processes used to ingest a collection, transform it into an infrastructure independent form, and store the collection in an archive comprise the persistent storage steps of a persistent archive. The processes used to recreate the collection on new technology, optimize the database, and recreate the user interface comprise the retrieval steps of a persistent archive. <P2> In order to build a persistent collection, we consider a solution that "abstracts" all aspects of the data and its preservation. In this approach, data objects and processes are codified by raising them above the machine/software dependent forms to an abstract format that can be used to recreate the object and the processes in any new desirable forms. <P3> The SDSC infrastructure uses object-relational databases to organize information. This makes data ingestion more complex by requiring the mapping of the XML DTD semi-structured representation onto a relational schema. <P4> The steps used to store the persistent archive were: (1) Define Digital Object: define meta-data, define object structure (OBJ-DTD) --- (A), define object DTD to object DDL mapping --- (B) (2) Define Collection: define meta-data, define collection structure (COLL-DTD) --- (C), define collection DTD structure to collection DDL mapping --- (D) (3) Define Containers: define packing format for encapsulating data and meta-data (examples are the AIP standard, Hierarchical Data Format, Document Type Definition) <P5> In the ingestion phase, the relational and semi-structured organization of the meta-data is defined. No database is actually created, only the mapping between the relational organization and the object DTD.
<P6> Note that the collection relational organization does not have to encompass all of the attributes that are associated with a digital object. Separate information models are used to describe the objects and the collections. It is possible to take the same set of digital objects and form a new collection with a new relational organization. <P7> Multiple communities across academia, the federal government, and standards groups are exploring strategies for managing very large archives. The persistent archive community needs to maintain interactions with these communities to track development of new strategies for data management and storage. <warrant>
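The loading phase described in these excerpts can be sketched as follows: semi-structured XML records are flattened into rows of a relational table via a fixed element-to-column mapping. The `<message>` element names are invented stand-ins, not the actual SDSC e-mail collection DTD.

```python
# Sketch of the ingestion/loading phase: XML records (semi-structured)
# are mapped onto a relational schema. Element and column names are
# hypothetical illustrations.
import sqlite3
import xml.etree.ElementTree as ET

RECORDS = """<collection>
  <message><sender>alice</sender><subject>status</subject></message>
  <message><sender>bob</sender><subject>archive plan</subject></message>
</collection>"""

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE message (sender TEXT, subject TEXT)")
for msg in ET.fromstring(RECORDS).iter("message"):
    # The DTD-to-DDL mapping here is simply element name -> column name.
    db.execute("INSERT INTO message VALUES (?, ?)",
               (msg.findtext("sender"), msg.findtext("subject")))
rows = db.execute("SELECT sender, subject FROM message").fetchall()
print(rows)
```

The definition phase (deciding the mapping) stays infrastructure independent; only this loading step is tied to the particular database in use.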
Conclusions
RQ "The four major components of the persistent archive system are support for ingestion, archival storage, information discovery, and presentation of the collection. The first two components focus on the ingestion of data into collections. The last two focus on access to the resulting collections. The technology to support each of these processes is still rapidly evolving. Hence consensus on standards has not been reached for many of the infrastructure components. At the same time, many of the components are active areas of research. To reach consensus on a feasible collection-based persistent archive, continued research and development is needed. Examples of the many related issues are listed below:
Type
Electronic Journal
Title
Buckets: A new digital technology for preserving NASA research
CA Buckets are information objects designed to reduce dependency on traditional archives and database systems, thereby making them more resilient to the transient nature of information systems.
Phrases
Another focus of aggregation was including the metadata with the data. Through experience, NASA researchers found that metadata tended to "drift" over time, becoming decoupled from the data it described or locked in specific DL systems and hard to extract or share with other systems. (p. 377) Buckets are designed to imbue the information objects with certain responsibilities, such as display, dissemination, protection, and maintenance of its contents. As such, buckets should be able to work with many DL systems simultaneously, and minimize or eliminate the necessary modification of DL systems to work with buckets. Ideally, buckets should work with everything and break nothing. This philosophy is formalized in the SODA DL model: the objects become "smarter" at the expense of the archives (that become "dumber"), as functionalities generally associated with archives are moved into the data objects themselves. (p. 390)
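The bucket philosophy above can be sketched as a self-describing object that carries its metadata alongside its data and is itself responsible for display. The class and method names are invented for illustration; they are not the actual NASA/SODA bucket interface.

```python
# Illustrative sketch of the bucket idea: the object, not the archive,
# holds its metadata and its own display behavior. All names here are
# hypothetical, not the real bucket API.
class Bucket:
    def __init__(self, identifier):
        self.identifier = identifier
        self.packages = {}  # element name -> (payload bytes, metadata dict)

    def add(self, name, payload, metadata):
        # Metadata travels with the data it describes, so it cannot
        # "drift" away from it or get locked inside one DL system.
        self.packages[name] = (payload, metadata)

    def display(self):
        # The bucket is responsible for its own dissemination.
        return {name: meta for name, (_, meta) in self.packages.items()}

b = Bucket("naca-report-1234")
b.add("report.pdf", b"%PDF-...", {"format": "application/pdf"})
print(b.display())
```

In SODA terms, the archive holding `b` needs no knowledge of its contents: any DL system that can store and invoke the object gets display and maintenance for free.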
Conclusions
RQ The creation of high quality tools for bucket creation and administration is absolutely necessary. The extension of authentication and security measures is key to supporting more technologies. Many applications of this sort of information object independence remains to be explored.
Type
Electronic Journal
Title
A Metadata Framework Developed at the Tsinghua University Library to Aid in the Preservation of Digital Resources
This article provides an overview of work completed at Tsinghua University Library in which a metadata framework was developed to aid in the preservation of digital resources. The metadata framework is used for the creation of metadata to describe resources, and includes an encoding standard used to store metadata and resource structures in information systems. The author points out that the Tsinghua University Library metadata framework provides a successful digital preservation solution that may be an appropriate solution for other organizations as well.
Notes
Well-laid-out diagrams show the structural layers of resources; encoding examples are also included.
ISSN
1082-9873
DOI
10.1045/november2002-niu
Critical Arguments
CA The author delineates the metadata schema implemented at Tsinghua University Library, which allows for resource description and preservation.
Type
Electronic Journal
Title
Search for Tomorrow: The Electronic Records Research Program of the U.S. National Historical Publications and Records Commission
The National Historical Publications and Records Commission (NHPRC) is a small grant-making agency affiliated with the U.S. National Archives and Records Administration. The Commission is charged with promoting the preservation and dissemination of documentary source materials to ensure an understanding of U.S. history. Recognizing that the increasing use of computers created challenges for preserving the documentary record, the Commission adopted a research agenda in 1991 to promote research and development on the preservation and continued accessibility of documentary materials in electronic form. From 1991 to the present the Commission awarded 31 grants totaling $2,276,665 for electronic records research. Most of this research has focused on two issues of central concern to archivists: (1) electronic record keeping (tools and techniques to manage electronic records produced in an office environment, such as word processing documents and electronic mail), and (2) best practices for storing, describing, and providing access to all electronic records of long-term value. NHPRC grants have raised the visibility of electronic records issues among archivists. The grants have enabled numerous archives to begin to address electronic records problems, and, perhaps most importantly, they have stimulated discussion about electronic records among archivists and records managers.
Publisher
Elsevier Science Ltd
Critical Arguments
CA "The problem of maintaining electronic records over time is big, expensive, and growing. A task force on digital archives established by the Commission on Preservation and Access in 1994 commented that the life of electronic records could be characterized in the same words Thomas Hobbes once used to describe life: "nasty, brutish, and short" [1]. Every day, thousands of new electronic files are created on federal, state, and local government computers across the nation. A small but important portion of these records will be designated for permanent retention. Government agencies are increasingly relying on computers to maintain information such as census files, land titles, statistical data, and vital records. But how should electronic records with long-term value be maintained? Few government agencies have developed comprehensive policies for managing current electronic records, much less preserving those with continuing value for historians and other researchers. Because of this serious and growing problem, the National Historical Publications and Records Commission (NHPRC), a small grantmaking agency affiliated with the U.S. National Archives and Records Administration (NARA), has been making grants for research and development on the preservation and use of electronic documentary sources. The program is conducted in concert with NARA, which in 1996 issued a strategic plan that gives high priority to mastering electronic records problems in partnership with federal government agencies and the NHPRC."
Phrases
<P1> How can data dictionaries, information resource directory systems, and other metadata systems be used to support electronic records management and archival requirements? <P2> In spite of the number of projects the Commission has supported, only four questions from the research agenda have been addressed to date. Of these, the question relating to requirements for the development of data dictionaries and other metadata systems (question number four) has produced a single grant for a state information locator system in South Carolina, and the question relating to needs for archival education (question 10) has led to two grants to the Society of American Archivists for curricular materials. <P3> Information systems created without regard for these considerations may have deficiencies that limit the usefulness of the records contained on them. <warrant> <P4> The NHPRC has awarded major grants to four institutions over the past five years for projects to develop and test requirements for electronic record keeping: University of Pittsburgh (1993): A working set of functional requirements and metadata specifications for electronic record keeping systems; City of Philadelphia (1995, 1996, and 1997): A project to incorporate a subset of the Pittsburgh metadata specifications into a new human resources information system and other city systems as test cases and to develop comprehensive record keeping policies and standards for the city's information technology systems; Indiana University (1995): A project to develop an assessment tool and methodology for analyzing existing electronic records systems, using the Pittsburgh functional requirements as a model and the student academic record system and a financial system as test cases; Research Foundation of the State University of New York-Albany, Center for Technology in Government (1996): A project to identify best practices for electronic record keeping, including work by the U.S.
Department of Defense and the University of British Columbia in addition to the University of Pittsburgh. The Center is working with the state's Adirondack Parks Agency in a pilot project to develop a system model for incorporating record keeping and archival considerations into the creation of networked computing and communications applications. <P5> No definitive solution has yet been identified for the problems posed by electronic records, although progress has been made in learning what will be needed to design functional electronic record keeping systems. <P6> With the proliferation of digital libraries, the need for long-term storage, migration and retrieval strategies for electronic information has become a priority for a wide variety of information providers. <warrant>
Conclusions
RQ "How best to preserve existing and future electronic formats and provide access to them over time has remained elusive. The answers cannot be found through theoretical research alone, or even through applied research, although both are needed. Answers can only emerge over time as some approaches prove able to stand the test of time and others do not. The problems are large because the costs of maintaining, migrating, and retrieving electronic information continue to be high." ... "Perhaps most importantly, these grants have stimulated widespread discussion of electronic records issues among archivists and record managers, and thus they have had an impact on the preservation of the electronic documentary record that goes far beyond the Commission's investment."
SOW
DC The National Historical Publications and Records Commission (NHPRC) is the outreach arm of the National Archives and makes plans for and studies issues related to the preservation, use, and publication of historical documents. The Commission also makes grants to non-Federal archives and other organizations to promote the preservation and use of America's documentary heritage.
Type
Electronic Journal
Title
Metadata: The right approach, An integrated model for descriptive and rights metadata in E-commerce
If you've ever completed a large and difficult jigsaw puzzle, you'll be familiar with that particular moment of grateful revelation when you find that two sections you've been working on separately actually fit together. The overall picture becomes coherent, and the task at last seems achievable. Something like this seems to be happening in the puzzle of "content metadata." Two communities -- rights owners on one hand, libraries and cataloguers on the other -- are staring at their unfolding data models and systems, knowing that somehow together they make up a whole picture. This paper aims to show how and where they fit.
ISSN
1082-9873
Critical Arguments
CA "This paper looks at metadata developments from this standpoint -- hence the "right" approach -- but does so recognising that in the digital world many Chinese walls that appear to separate the bibliographic and commercial communities are going to collapse." ... "This paper examines three propositions which support the need for radical integration of metadata and rights management concerns for disparate and heterogeneous materials, and sets out a possible framework for an integrated approach. It draws on models developed in the CIS plan and the DOI Rights Metadata group, and work on the ISRC, ISAN, and ISWC standards and proposals. The three propositions are: DOI metadata must support all types of creation; The secure transaction of requests and offers data depends on maintaining an integrated structure for documenting rights ownership agreements; All elements of descriptive metadata (except titles) may also be elements of agreements. The main consequences of these propositions are: A cross-sector vocabulary is essential; Non-confidential terms of rights ownership agreements must be generally accessible in a standard form. (In its purest form, the e-commerce network must be able to automatically determine the current owner of any right in any creation for any territory.); All descriptive metadata values (except titles) must be stored as unique, coded values. If correct, the implications of these propositions on the behaviour, and future inter-dependency, of the rights-owning and bibliographic communities are considerable."
Phrases
<P1> Historically, metadata -- "data about data" -- has been largely treated as an afterthought in the commercial world, even among rights owners. Descriptive metadata has often been regarded as the proper province of libraries, a battlefield of competing systems of tags and classification and an invaluable tool for the discovery of resources, while "business" metadata lurked, ugly but necessary, in distribution systems and EDI message formats. Rights metadata, whatever it may be, may seem to have barely existed in a coherent form at all. <P2> E-commerce offers the opportunity to integrate the functions of discovery, access, licensing and accounting into single point-and-click actions in which metadata is a critical agent, a glue which holds the pieces together. <warrant> <P3> E-commerce in rights will generate global networks of metadata every bit as vital as the networks of optical fibre -- and with the same requirements for security and unbroken connectivity. <warrant> <P4> The sheer volume and complexity of future rights trading in the digital environment will mean that any but the most sporadic level of human intervention will be prohibitively expensive. Standardised metadata is an essential component. <warrant> <P5> Just as the creators and rights holders are the sources of the content for the bibliographic world, so it seems inevitable they will become the principal source of core metadata in the web environment, and that metadata will be generated simultaneously and at source to meet the requirements of discovery, access, protection, and reward. <P6> However, under the analysis being carried out within the communities identified above and by those who are developing technology and languages for rights-based e-commerce, it is becoming clear that "functional" metadata is also a critical component. 
It is metadata (including identifiers) which defines a creation and its relationship to other creations and to the parties who created and variously own it; without a coherent metadata infrastructure e-commerce cannot properly flow. Securing the metadata network is every bit as important as securing the content, and there is little doubt which poses the greater problem. <warrant> <P7> Because creations can be nested and modified at an unprecedented level, and because online availability is continuous, not a series of time-limited events like publishing books or selling records, dynamic and structured maintenance of rights ownership is essential if the currency and validity of offers is to be maintained. <warrant> <P8> Rights metadata must be maintained and linked dynamically to all of its related content. <P9> A single, even partial, change to rights ownership in the original creation needs to be communicated through this chain to preserve the currency of permissions and royalty flow. There are many options for doing this, but they all depend, among other things, on the security of the metadata network. <warrant> <P10> As digital media causes copyright frameworks to be rewritten on both sides of the Atlantic, we can expect measures of similar and greater impact at regular intervals affecting any and all creation types: yet such changes can be relatively simple to implement if metadata is held in the right way in the right place to begin with. <warrant> <P11> The disturbing but inescapable consequence is that it is not only desirable but essential for all elements of descriptive metadata, except for titles, to be expressed at the outset as structured and standardised values to preserve the integrity of the rights chain. <P12> Within the DOI community, which embraces commercial and library interests, the integration of rights and descriptive metadata has become a matter of priority.
<P13> What is required is that the establishment of a creation description (for example, the registration of details of a new article or audio recording) or of change of rights control (for example, notification of the acquisition of a work or a catalogue of works) can be done in a standardised and fully structured way. <warrant> <P14> Unless the chain is well maintained at source, all downstream transactions will be jeopardised, for in the web environment the CIS principle of "do it once, do it right" is seen at its ultimate. A single occurrence of a creation on the web, and its supporting metadata, can be the source for all uses. <P15> One of the tools to support this development is the RDF (Resource Description Framework). RDF provides a means of structuring metadata for anything, and it can be expressed in XML. <P16> Although formal metadata standards hardly exist within ISO, they are appearing through the "back door" in the form of mandatory supporting data for identifier standards such as ISRC, ISAN and ISWC. A major function of the INDECS project will be to ensure the harmonisation of these standards within a single framework. <P17> In an automated, protected environment, this requires that the rights transaction is able to generate automatically a new descriptive metadata set through the interaction of the agreement terms with the original creation metadata. This can only happen (and it will be required on a massive scale) if rights and descriptive metadata terminology is integrated and standardised. <warrant> <P18> As resources become available virtually, it becomes as important that the core metadata itself is not tampered with as it is that the object itself is protected. Persistence is now not only a necessary characteristic of identifiers but also of the structured metadata that attends them. <P19> This leads us also to the conclusion that, ideally, standardised descriptive metadata should be embedded into objects for its own protection.
<P20> It also leads us to the possibility of metadata registration authorities, such as the numbering agencies, taking wider responsibilities. <P21> If this paper is correct in its propositions, then rights metadata will have to rewrite half of Dublin Core or else ignore it entirely. <P22> The web environment with its once-for-all means of access provides us with the opportunity to eliminate duplication and fragmentation of core metadata; and at this moment, there are no legacy metadata standards to shackle the information community. We have the opportunity to go in with our eyes open with standards that are constructed to make the best of the characteristics of the new digital medium. <warrant>
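As P15 notes, RDF provides a means of structuring metadata and can be expressed in XML. A minimal, standard-library-only sketch of a creation description in RDF/XML follows; the DOI-style identifier, title, and creator values are invented examples, and Dublin Core properties are used purely as a sample vocabulary.

```python
# Minimal sketch of descriptive metadata expressed as RDF in XML,
# built with the standard library only. Identifier and property
# values are hypothetical.
import xml.etree.ElementTree as ET

RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("rdf", RDF)
ET.register_namespace("dc", DC)

root = ET.Element(f"{{{RDF}}}RDF")
desc = ET.SubElement(root, f"{{{RDF}}}Description",
                     {f"{{{RDF}}}about": "doi:10.1000/example"})
ET.SubElement(desc, f"{{{DC}}}title").text = "An Example Creation"
ET.SubElement(desc, f"{{{DC}}}creator").text = "A. Rightsholder"

print(ET.tostring(root, encoding="unicode"))
```

Because each statement is a structured property of an identified resource rather than free text, the same record can serve discovery, rights negotiation, and downstream agreement metadata without re-keying.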
Conclusions
RQ "The INDECS project (assuming its formal adoption next month), in which the four major communities are active, and with strong links to ISO TC46 and MPEG, will provide a cross-sector framework for this work in the short-term. The DOI Foundation itself may be an appropriate umbrella body in the future. We may also consider that perhaps the main function of the DOI itself may not be, as originally envisaged, to link user to content -- which is a relatively trivial task -- but to provide the glue to link together creation, party, and agreement metadata. The model that rights owners may be wise to follow in this process is that of MPEG, where the technology industry has tenaciously embraced a highly-regimented, rolling standardisation programme, the results of which are fundamental to the success of each new generation of products. Metadata standardisation now requires the same technical rigour and commercial commitment. However, in the meantime the bibliographic world, working on what it has always seen its own part of the jigsaw puzzle, is actively addressing many of these issues in an almost parallel universe. The question remains as to how in practical terms the two worlds, rights and bibliographic, can connect, and what may be the consequences of a prolonged delay in doing so." ... "The former I encourage to make a case for continued support and standardisation of a flawed Dublin Core in the light of the propositions I have set out in this paper, or else engage with the DOI and rights owner communities in its revision to meet the real requirements of digital commerce in its fullest sense."
SOW
DC "There are currently four major active communities of rights-holders directly confronting these questions: the DOI community, at present based in the book and electronic publishing sector; the IFPI community of record companies; the ISAN community embracing producers, users, and rights owners of audiovisuals; and the CISAC community of collecting societies for composers and publishers of music, but also extending into other areas of authors' rights, including literary, visual, and plastic arts." ... "There are related rights-driven projects in the graphic, photographic, and performers' communities. E-commerce means that metadata solutions from each of these sectors (and others) require a high level of interoperability. As the trading environment becomes common, traditional genre distinctions between creation-types become meaningless and commercially destructive."
Type
Electronic Journal
Title
The Dublin Core Metadata Initiative: Mission, Current Activities, and Future Directions
Metadata is a keystone component for a broad spectrum of applications that are emerging on the Web to help stitch together content and services and make them more visible to users. The Dublin Core Metadata Initiative (DCMI) has led the development of structured metadata to support resource discovery. This international community has, over a period of 6 years and 8 workshops, brought forth: A core standard that enhances cross-disciplinary discovery and has been translated into 25 languages to date; A conceptual framework that supports the modular development of auxiliary metadata components; An open consensus building process that has brought to fruition Australian, European and North American standards with promise as a global standard for resource discovery; An open community of hundreds of practitioners and theorists who have found a common ground of principles, procedures, core semantics, and a framework to support interoperable metadata.
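The core standard mentioned above is the fifteen-element Dublin Core set, commonly serialized as oai_dc XML. As a minimal sketch (using only the Python standard library; the record values are drawn from entries elsewhere in this bibliography), a simple record can be built like this:

```python
# Sketch: building a Dublin Core record in the oai_dc XML serialization.
import xml.etree.ElementTree as ET

DC_NS = "http://purl.org/dc/elements/1.1/"
OAI_DC_NS = "http://www.openarchives.org/OAI/2.0/oai_dc/"

def make_dc_record(fields):
    """Build an oai_dc record from a dict of Dublin Core element -> value."""
    ET.register_namespace("dc", DC_NS)
    ET.register_namespace("oai_dc", OAI_DC_NS)
    root = ET.Element("{%s}dc" % OAI_DC_NS)
    for element, value in fields.items():
        child = ET.SubElement(root, "{%s}%s" % (DC_NS, element))
        child.text = value
    return ET.tostring(root, encoding="unicode")

record = make_dc_record({
    "title": "Words and Phrases",
    "publisher": "West Group",
    "language": "en",
})
```

Because the element set is small and semantically flat, records like this can be produced by any collection and merged for cross-disciplinary discovery, which is the interoperability argument the abstract makes.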
Type
Report
Title
Mapping of the Encoded Archival Description DTD Element Set to the CIDOC CRM
The CIDOC CRM is the first ontology designed to mediate contents in the area of material cultural heritage and beyond, and has been accepted by ISO TC46 as work item for an international standard. The EAD Document Type Definition (DTD) is a standard for encoding archival finding aids using the Standard Generalized Markup Language (SGML). Archival finding aids are detailed guides to primary source material which provide fuller information than that normally contained within cataloging records. 
Publisher
Institute of Computer Science, Foundation for Research and Technology - Hellas
Publication Location
Heraklion, Crete, Greece
Language
English
Critical Arguements
CA "This report describes the semantic mapping of the current EAD DTD Version 1.0 Element Set to the CIDOC CRM and its latest extension. This work represents a proof of concept for the functionality the CIDOC CRM is designed for." 
Conclusions
RQ "Actually, the CRM seems to do the job quite well – problems in the mapping arise more from underspecification in the EAD rather than from too domain-specific notions." ... "To our opinion, the archival community could benefit from the conceptualizations of the CRM to motivate more powerful metadata standards with wide interoperability in the future, to the benefit of museums and other disciplines as well."
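The report's mapping pairs EAD elements with CIDOC CRM classes. The fragment below is an illustrative sketch only, not the report's actual mapping table; the EAD element names and CRM class pairings are simplified examples of the kind of correspondence involved:

```python
# Illustrative (not authoritative) element-to-class correspondences of the
# kind established between the EAD DTD and the CIDOC CRM.
EAD_TO_CRM = {
    "persname": "E21 Person",      # personal name -> CRM Person
    "geogname": "E53 Place",       # geographic name -> CRM Place
    "unitdate": "E52 Time-Span",   # date of the unit -> CRM Time-Span
    "unittitle": "E35 Title",      # title of the unit -> CRM Title
}

def crm_class(ead_element: str) -> str:
    """Look up the CRM class for an EAD element, with a generic fallback."""
    return EAD_TO_CRM.get(ead_element, "E1 CRM Entity")
```

A mapping of this shape supports the report's conclusion: where an EAD element is underspecified, it can only be mapped to a very generic CRM class, which is where the mapping problems arise.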
SOW
DC "As a potential international standard, the EAD DTD is maintained in the Network Development and MARC Standards Office of the Library of Congress in partnership with the Society of American Archivists." ... "The CIDOC Conceptual Reference Model (see [CRM1999], [Doerr99]), in the following only referred to as «CRM», is outcome of an effort of the Documentation Standards Group of the CIDOC Committee (see «http:/www.cidoc.icom.org», "http://cidoc.ics.forth.gr") of ICOM, the International Council of Museums beginning in 1996."
This document presents the ARTISTE three-level approach to providing an open and flexible solution for combined metadata and image content-based search and retrieval across multiple, distributed image collections. The intended audience for this report includes museum and gallery owners who are interested in providing or extending services for remote access, developers of collection management and image search and retrieval systems, and standards bodies in both the fine art and digital library domains.
Notes
ARTISTE (http://www.artisteweb.org/) is a European Commission supported project that has developed integrated content and metadata-based image retrieval across several major art galleries in Europe. Collaborating galleries include the Louvre in Paris, the Victoria and Albert Museum in London, the Uffizi Gallery in Florence and the National Gallery in London.
Edition
Version 2.0
Publisher
The ARTISTE Consortium
Publication Location
Southampton, United Kingdom
Accessed Date
08/24/05
Critical Arguements
<CA> Over the last two and a half years, ARTISTE has developed an image search and retrieval system that integrates distributed, heterogeneous image collections. This report positions the work achieved in ARTISTE with respect to metadata standards and approaches for open search and retrieval using digital library technology. In particular, this report describes three key aspects of ARTISTE: the transparent translation of local metadata to common standards such as Dublin Core and SIMI consortium attribute sets to allow cross-collection searching; a methodology for combining metadata and image content-based analysis into single searches to enable versatile retrieval and navigation facilities within and between gallery collections; and an open interface for cross-collection search and retrieval that advances existing open standards for remote access to digital libraries, such as OAI (Open Archive Initiative) and ZING SRW (Z39.50 International: Next Generation Search and Retrieval Web Service).
Conclusions
RQ "A large part of ARTISTE is concerned with use of existing standards for metadata frameworks. However, one area where existing standards have not been sufficient is multimedia content-based search and retrieval. A proposal has been made to ZING for additions to SRW. This will hopefully enable ARTISTE to make a valued contribution to this rapidly evolving standard." ... "The work started in ARTISTE is being continued in SCULPTEUR, another project funded by the European Commission. SCULPTEUR will develop both the technology and the expertise to create, manage, and present cultural archives of 3D models and associated multimedia objects." ... "We believe the full benefit of multimedia search and retrieval can only be realised through seamless integration of content-based analysis techniques. However, not only does introduction of content-based analysis require modification to existing standards as outlined in this report, but it also requires a review of the use of semantics in achieving digital library interoperability. In particular, machine understandable description of the semantics of textual metadata, multimedia content, and content-based analysis, can provide a foundation for a new generation of flexible and dynamic digital library tools and services." ... "Existing standards do not use explicit semantics to describe query operators or their application to metadata and multimedia content at individual sites. However, dynamically determining what operators and types are supported by a collection is essential to robust and efficient cross-collection searching. Dynamic use of published semantics would allow a collection and any associated content-based analysis to be changed by its owner without breaking conformance to search and retrieval standards. Furthermore, individual sites would not need to publish detailed, human readable descriptions of available functionality."
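The cross-collection searching described in the CA field works by translating a query phrased in one collection's local fields into another's via a common attribute set. The sketch below illustrates the idea with plain Python dicts; the field names are hypothetical, and the real ARTISTE system performs this translation through RDF mappings onto Dublin Core rather than hard-coded tables:

```python
# Minimal sketch of cross-collection query translation in the spirit of
# ARTISTE. Local field names are invented; each local schema is mapped to
# a common (Dublin Core-like) vocabulary, which acts as the pivot.
MAPPINGS = {
    "gallery_a": {"titre": "title", "auteur": "creator", "datation": "date"},
    "gallery_b": {"obj_title": "title", "artist": "creator", "made": "date"},
}

def translate_query(source, target, query):
    """Rewrite a field->value query from one collection's schema to another's."""
    to_common = MAPPINGS[source]
    from_common = {common: local for local, common in MAPPINGS[target].items()}
    translated = {}
    for field, value in query.items():
        common_field = to_common[field]          # local -> common vocabulary
        translated[from_common[common_field]] = value  # common -> target-local
    return translated

q = translate_query("gallery_a", "gallery_b", {"titre": "La Joconde"})
```

The pivot-vocabulary design is what lets each gallery keep its existing schema: adding a collection requires only one new mapping to the common set, not a mapping to every other collection.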
SOW
DC "Four major European galleries are involved in the project: the Uffizi in Florence, the National Gallery and the Victoria and Albert Museum in London, and the Centre de Recherche et de Restauration des Musees de France (C2RMF) which is the Louvre related restoration centre. The ARTISTE system currently holds over 160,000 images from four separate collections owned by these partners. The galleries have partnered with NCR, a leading player in database and Data Warehouse technology; Interactive Labs, the new media design and development facility of Italy's leading art publishing group, Giunti; IT Innovation, a specialist in building innovative IT systems, and the Department of Electronics and Computer Science at the University of Southampton."
Type
Report
Title
Advice: Introduction to the Victorian Electronic Records Strategy (VERS) PROS 99/007 (Version 2)
This document is an introduction to the PROV Standard for the Management of Electronic Records (PROS 99/007), also known as the VERS Standard. It provides background information on the goals of VERS and its approach to preservation. Nothing in this document imposes any requirements on agencies.
Critical Arguements
CA The Victorian Electronic Records Strategy (VERS) addresses the cost-effective, long-term preservation of electronic records. The structure and requirements of VERS are formally specified in the Standard for the Management of Electronic Records (PROS 99/007) and its five technical specifications. This Advice provides background to the Standard. It covers: the history of the VERS project; the preservation theory behind VERS; how the five specifications support the preservation theory; a brief introduction to the VERS Encapsulated Object (VEO). In this document we distinguish between the record and the content of the record. The content is the actual information contained in the record; for example, the report or the image. The record as a whole contains the record content and metadata that contains information about the record, including its context, description, history, and integrity control.
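The record/content distinction above is what the VERS Encapsulated Object realizes: the content is wrapped in a single object together with descriptive, contextual, and integrity-control metadata. The sketch below illustrates the encapsulation idea only; the element names are simplified placeholders, not the actual VERS specification tag set:

```python
# Illustrative sketch of a VEO-style encapsulation: record content plus
# metadata (including a checksum for integrity control) in one XML object.
# Tag names are invented for illustration, not taken from PROS 99/007.
import base64
import hashlib
import xml.etree.ElementTree as ET

def encapsulate(content: bytes, metadata: dict) -> str:
    veo = ET.Element("VEO")
    meta = ET.SubElement(veo, "Metadata")
    for key, value in metadata.items():
        ET.SubElement(meta, key).text = value
    # Integrity control: store a checksum of the content alongside it.
    ET.SubElement(meta, "Checksum").text = hashlib.sha256(content).hexdigest()
    # The content itself travels inside the same object, base64-encoded.
    ET.SubElement(veo, "Content").text = base64.b64encode(content).decode()
    return ET.tostring(veo, encoding="unicode")

veo_xml = encapsulate(b"Annual report 1998", {"Title": "Annual Report"})
```

Keeping metadata and content in one self-describing object is what lets the record outlive the recordkeeping system that produced it, which is the strategy's central preservation argument.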
Conclusions
<RQ>
SOW
<DC>Public Record Office Victoria is the archives of the State Government of Victoria. They hold records from the beginnings of the colonial administration of Victoria in the mid-1830s to today and are responsible for ensuring the accountability of the Victoria State Government. 
Type
Report
Title
Introduction to the Victorian Electronic Records Strategy (VERS) PROS 99/007 (Version 2)
CA VERS has two major goals: the preservation of electronic records and enabling efficient management in doing so. Version 2 has an improved structure, additional metadata elements, requirements for preservation and compliance requirements for agencies. "Export" compliance allows agencies to maintain their records within their own recordkeeping systems and add a module so they can generate the VERS format for export, especially for long term preservation. "Native" compliance is when records are converted to long term preservation format upon registration, which is seen as the ideal approach.
Type
Report
Title
Victorian Electronic Records Strategy: Final Report
This document is the Victorian Electronic Records Strategy (VERS) Standard (PROS 99/007). This document is the standard itself and is primarily concerned with conformance. The technical requirements of the Standard are contained in five Specifications.
Accessed Date
August 24, 2005
Critical Arguements
CA VERS has two major goals: the preservation of electronic records and enabling efficient management in doing so. Version 2 has an improved structure, additional metadata elements, requirements for preservation and compliance requirements for agencies. "Export" compliance allows agencies to maintain their records within their own recordkeeping systems and add a module so they can generate the VERS format for export, especially for long term preservation. "Native" compliance is when records are converted to long term preservation format upon registration, which is seen as the ideal approach. ... "The Victorian Electronic Records Strategy (VERS) is designed to assist agencies in managing their electronic records. The strategy focuses on the data or information contained in electronic records, rather than the systems that are used to produce them."
SOW
<DC> "VERS was developed with the assistance of CSIRO, Ernst & Young, the Department of Infrastructure, and records managers across government. The recommendations included in the VERS Final Report1 issued in March 1999 provide a framework for the management of electronic records." ... "Public Record Office Victoria is the Archives of the State of Victoria. They hold the records from the beginnings of the colonial administration of Victoria in the mid-1830s to today."
Type
Report
Title
RLG Best Practice Guidelines for Encoded Archival Description
These award-winning guidelines, released in August 2002, were developed by the RLG EAD Advisory Group to provide practical, community-wide advice for encoding finding aids. They are designed to: facilitate interoperability of resource discovery by imposing a basic degree of uniformity on the creation of valid EAD-encoded documents; encourage the inclusion of particular elements, and; develop a set of core data elements. 
Publisher
Research Libraries Group
Publication Location
Mountain View, CA, USA
Language
English
Critical Arguements
<CA> The objectives of the guidelines are: 1. To facilitate interoperability of resource discovery by imposing a basic degree of uniformity on the creation of valid EAD-encoded documents and to encourage the inclusion of elements most useful for retrieval in a union index and for display in an integrated (cross-institutional) setting; 2. To offer researchers the full benefits of XML in retrieval and display by developing a set of core data elements to improve resource discovery. It is hoped that by identifying core elements and by specifying "best practice" for those elements, these guidelines will be valuable to those who create finding aids, as well as to vendors and tool builders; 3. To contribute to the evolution of the EAD standard by articulating a set of best practice guidelines suitable for interinstitutional and international use. These guidelines can be applied to both retrospective conversion of legacy finding aids and the creation of new finding aids.  
Conclusions
<RQ>
SOW
<DC> "RLG organized the EAD working group as part of our continuing commitment to making archival collections more accessible on the Web. We offer RLG Archival Resources, a database of archival materials; institutions are encouraged to submit their finding aids to this database." ... "This set of guidelines, the second version promulgated by RLG, was developed between October 2001 and August 2002 by the RLG EAD Advisory Group. This group consisted of ten archivists and digital content managers experienced in creating and managing EAD-encoded finding aids at repositories in the United States and the United Kingdom."
Type
Web Page
Title
The Electronic Records Strategies Task Force Report: An Australian Perspective
CA The archival profession has a brief window of opportunity to become stakeholders in the realm of electronic records. In order to accomplish that, they must answer not only the "what" but the "why" of recordkeeping in all of its implications.
Conclusions
RQ How will American archivists deal with the re-invention of professional roles that have traditionally been bifurcated by records on one side and archives on the other? Where does continuum thinking leave SAA and its primary constituency of historical archivists?
Type
Web Page
Title
Documenting Business: The Australian Recordkeeping Metadata Schema
In July 1999, the Australian Recordkeeping Metadata Schema (RKMS) was approved by its academic and industry steering group. This metadata set now joins other community specific sets in being available for use and implementation into workplace applications. The RKMS has inherited elements from and built on many other metadata standards associated with information management. It has also contributed to the development of subsequent sector specific recordkeeping metadata sets. The importance of the RKMS as a framework for 'mapping' or reading other sets and also as a standardised set of metadata available for adoption in diverse implementation environments is now emerging. This paper explores the context of the SPIRT Recordkeeping Metadata Project, and the conceptual models developed by the SPIRT Research Team as a framework for standardising and defining Recordkeeping Metadata. It then introduces the elements of the SPIRT Recordkeeping Metadata Schema and explores its functionality before discussing implementation issues with reference to document management and workflow technologies.
Critical Arguements
CA Much of the metadata work done so far has rested on the passive assumption that records are document-like objects. Instead, records need to be seen as active entities in business transactions.
Conclusions
RQ In order to decide which elements are to be used from the RKMS, organizations need to delineate the reach of specific implementations as far as how and when records need to be bound with metadata.
CA This is the first of four articles describing Geospatial Standards and the standards bodies working on these standards. This article will discuss what geospatial standards are and why they matter, identify major standards organizations, and list the characteristics of successful geospatial standards.
Conclusions
RQ Which federal and international standards have been agreed upon since this article's publication?
SOW
DC FGDC approved the Content Standard for Digital Geospatial Metadata (FGDC-STD-001-1998) in June 1998. FGDC is a 19-member interagency committee composed of representatives from the Executive Office of the President, Cabinet-level and independent agencies. The FGDC is developing the National Spatial Data Infrastructure (NSDI) in cooperation with organizations from State, local and tribal governments, the academic community, and the private sector. The NSDI encompasses policies, standards, and procedures for organizations to cooperatively produce and share geographic data.
This portal page provides links to all EAD-related information as applicable to those institutional members of the U.K. Archives Hub. It provides links to Creating EAD records, More about EAD, Reference, and More resources.
Publisher
The Archives Hub
Publication Location
Manchester, England, U.K.
Language
English
Critical Arguements
CA "These pages have been designed to hold links and information which we hope will be useful to archivists and librarians working in the UK Higher and Further Education sectors."
SOW
DC The Archives Hub provides a single point of access to 17,598 descriptions of archives held in UK universities and colleges. At present these are primarily at collection-level, although complete catalogue descriptions are provided where they are available. The Archives Hub forms one part of the UK's National Archives Network, alongside related networking projects. A Steering Committee which includes representatives of contributing institutions, the National Archives and the other archive networks guides the progress of the project. There is also a Contributors' and Users' Forum which provides feedback to aid the development of the service. The service is hosted at MIMAS on behalf of the Consortium of University Research Libraries (CURL) and is funded by the Joint Information Systems Committee (JISC). Systems development work is undertaken at the University of Liverpool.
Type
Web Page
Title
An Assessment of Options for Creating Enhanced Access to Canada's Audio-Visual Heritage
CA "This project was conducted by Paul Audley & Associates to investigate the feasibility of single window access to information about Canada's audio-visual heritage. The project follows on the recommendations of Fading Away, the 1995 report of the Task Force on the Preservation and Enhanced Use of Canada's Audio-Visual Heritage, and the subsequent 1997 report Search + Replay. Specific objectives of this project were to create a profile of selected major databases of audio-visual materials, identify information required to meet user needs, and suggest models for single-window access to audio-visual databases. Documentary research, some 35 interviews, and site visits to organizations in Vancouver, Toronto, Ottawa and Montreal provided the basis upon which the recommendations of this report were developed."
Type
Web Page
Title
JISC/NPO studies on the preservation of electronic materials: A framework of data types and formats, and issues affecting the long term preservation of digital material
CA Proposes a framework for preserving digital objects and discusses steps in the preservation process. Addresses a series of four questions: Why preserve? How much? How? And Where? Proposes a "Preservation Complexity Scorecard" to help identify the complexity of preservation needs and the appropriate preservation approach for a given object. "Although a great deal has been discussed and written about digital material preservation, there would appear to be no overall structure which brings together the findings of the numerous contributors to the debate, and allows them to be compared. This Report attempts to provide such a structure, whereby it should be possible to identify the essential elements of the preservation debate and to determine objectively the criticality of the other unresolved issues. This Report attempts to identify the most critical issues and employ them in order to determine their affect [sic] on preservation practice." (p. 5)
Conclusions
RQ "The study concludes that the overall management task in long term preservation is to moderate the pressure to preserve (Step 1) with the constraints dictated by a cost-effective archive (Step 3). This continuing process of moderation is documented through the Scorecard." (p. 6) "The Study overall recommends that a work programme should be started to: (a) Establish a Scorecard approach (to measure preservation complexity), (b) Establish an inventory of archive items (with complexity ratings) and (c) Establish a Technology Watch (to monitor shifts in technology), in order to be able to manage technological change. And in support of this, (a) establish a programme of work to explore the interaction of stakeholders and a four level contextual model in the preservation process." (p. 6) A four level contextual approach, with data dictionary entry definitions, should be built in order to provide an information structure that will permit the successful retrieval and interpretation of an object in 50 years time. A study should be established to explore the principle of encapsulating documents using the four levels of context, stored in a format, possibly encrypted, that can be transferred across technologies and over time. <warrant> (p. 31) A more detailed study should be made of the inter-relationships of the ten stakeholders, and how they can be made to support the long term preservation of digital material. This will be linked to the economics of archive management (the cost model), changes in legislation (Legal Deposit, etc.), the risks of relying on links between National Libraries to maintain collections (threats of wholesale destruction of collections), and loss through viruses (technological turbulence). (p. 36) A technology management trail (within the Scorecard -- see Step 2 of the Framework) should be established before the more complex digital material is stored.
This is to ensure that, for an item of digital material, the full extent of the internal interrelationships are understood, and the implications for long term preservation in a variety of successive environments are documented. (p. 37)
SOW
DC "The study is part of a wider programme of studies, funded by the Joint Information Systems Committee ("JISC"). The programme was initiated as a consequence of a two day workshop at Warwick University, in late November 1995. The workshop addressed the Long Term Preservation of Electronic Materials. The attendees represented an important cross-section of academic, librarian, curatorial, managerial and technological interests. 18 potential action points emerged, and these were seen as a basis for initiating further activity. After consultation, JISC agreed to fund a programme of studies." (p. 7) "The programme of studies is guided by the Digital Archive Working Group, which reports to the Management Committee of the National Preservation Office. The programme is administered by the British Library Research and Innovation Centre." (p. 2)
Type
Web Page
Title
Archiving The Avant Garde: Documenting And Preserving Variable Media Art.
Archiving the Avant Garde is a collaborative project to develop, document, and disseminate strategies for describing and preserving non-traditional, intermedia, and variable media art forms, such as performance, installation, conceptual, and digital art. This joint project builds on existing relationships and the previous work of its founding partners in this area. One example of such work is the Conceptual & Intermedia Arts Online (CIAO) Consortium, a collaboration founded by the BAM/PFA, the Walker Art Center, and Franklin Furnace, that includes 12 other international museums and arts organizations. CIAO develops standardized methods of documenting and providing access to conceptual and other ephemeral intermedia art forms. Another example of related work conducted by the project's partners is the Variable Media Initiative, organized by the Guggenheim Museum, which encourages artists to define their work independently from medium so that the work can be translated once its current medium is obsolete. Archiving the Avant Garde will take the ideas developed in previous efforts and develop them into community-wide working strategies by testing them on specific works of art in the practical working environments of museums and arts organizations. The final project report will outline a comprehensive strategy and model for documenting and preserving variable media works, based on case studies to illustrate practical examples, but always emphasizing the generalized strategy behind the rule. This report will be informed by specific and practical institutional practice, but we believe that the ultimate model developed by the project should be based on international standards independent of any one organization's practice, thus making it adaptable to many organizations. Dissemination of the report, discussed in detail below, will be ongoing and widespread.
Critical Arguements
CA "Works of variable media art, such as performance, installation, conceptual, and digital art, represent some of the most compelling and significant artistic creation of our time. These works are key to understanding contemporary art practice and scholarship, but because of their ephemeral, technical, multimedia, or otherwise variable natures, they also present significant obstacles to accurate documentation, access, and preservation. The works were in many cases created to challenge traditional methods of art description and preservation, but now, lacking such description, they often comprise the more obscure aspects of institutional collections, virtually inaccessible to present day researchers. Without strategies for cataloging and preservation, many of these vital works will eventually be lost to art history. Description of and access to art collections promote new scholarship and artistic production. By developing ways to catalog and preserve these collections, we will both provide current and future generations the opportunity to learn from and be inspired by the works and ensure the perpetuation and accuracy of art historical records. It is to achieve these goals that we are initiating the consortium project Archiving the Avant Garde: Documenting and Preserving Variable Media Art."
Conclusions
RQ "Archiving the Avant Garde will take a practical approach to solving problems in order to ensure the feasibility and success of the project. This project will focus on key issues previously identified by the partners and will leave other parts of the puzzle to be solved by other initiatives and projects in regular communication with this group. For instance, this project realizes that the arts community will need to develop software tools which enable collections care professionals to implement the necessary new description and metadata standards, but does not attempt to develop such tools in the context of this project. Rather, such tools are already being developed by a separate project under MOAC. Archiving the Avant Garde will share information with that project and benefit from that work. Similarly, the prospect of developing full-fledged software emulators is one best solved by a team of computer scientists, who will work closely with members of the proposed project to cross-fertilize methods and share results. Importantly, while this project is focused on immediate goals, the overall collaboration between the partner organizations and their various initiatives will be significant in bringing together the computer science, arts, standards, and museum communities in an open-source project model to maximize collective efforts and see that the benefits extend far and wide."
SOW
DC "We propose a collaborative project that will begin to establish such professional best practice. The collaboration, consisting of the Berkeley Art Museum and Pacific Film Archive (BAM/PFA), the Solomon R. Guggenheim Museum, Rhizome.org, the Franklin Furnace Archive, and the Cleveland Performance Art Festival and Archive, will have national impact due to the urgent and universal nature of the problem for contemporary art institutions, the practicality and adaptability of the model developed by this group, and the significant expertise that this nationwide consortium will bring to bear in the area of documenting and preserving variable media art." ... "We believe that a model informed by and tested in such diverse settings, with broad public and professional input (described below), will be highly adaptable." ..."Partners also represent a geographic and national spread, from East Coast to Midwest to West Coast. This coverage ensures that a wide segment of the professional community and public will have opportunities to participate in public forums, hosted at partner institutions during the course of the project, intended to gather an even broader cross-section of ideas and feedback than is represented by the partners." ... "The management plan for this project will be highly decentralized ensuring that no one person or institution will unduly influence the model strategy for preserving variable media art and thereby reduce its adaptability."
CA "The purpose of this document is: (1) To provide a better understanding of the functionality that the MPEG-21 multimedia framework should be capable of providing; (2) To offer high level descriptions of different MPEG-21 applications against which the formal requirements for MPEG-21 can be checked; (3) To act as a basis for devising Core Experiments which establish proof of concept; (4) To provide a point of reference to support the evaluation of responses submitted against ongoing MPEG-21 Calls for Proposals; (5) To be a 'Public Relations' instrument that can help to explain what MPEG-21 is about."
Conclusions
RQ not applicable
SOW
DC The Moving Picture Experts Group (MPEG) is a working group of ISO/IEC, made up of some 350 members from various industries and universities, in charge of the development of international standards for compression, decompression, processing, and coded representation of moving pictures, audio and their combination. MPEG's official designation is ISO/IEC JTC1/SC29/WG11. So far MPEG has produced the following compression formats and ancillary standards: MPEG-1, the standard for storage and retrieval of moving pictures and audio on storage media (approved Nov. 1992); MPEG-2, the standard for digital television (approved Nov. 1994); MPEG-4, the standard for multimedia applications; MPEG-7, the content representation standard for multimedia information search, filtering, management and processing; and MPEG-21, the multimedia framework.
CA Discussion of the challenges faced by librarians and archivists who must determine which and how much of the mass amounts of digitally recorded sound materials to preserve. Identifies various types of digital sound formats and the varying standards to which they are created. Specific challenges discussed include copyright issues; technologies and platforms; digitization and preservation; and metadata and other standards.
Conclusions
RQ "Whether between record companies and archives or with others, some type of collaborative approach to audio preservation will be necessary if significant numbers of audio recordings at risk are to be preserved for posterity. ... One particular risk of preservation programs now is redundancy. ... Inadequate cataloging is a serious impediment to preservation efforts. ... It would be useful to archives, and possibly to intellectual property holders as well, if archives could use existing industry data for the bibliographic control of published recordings and detailed listings of the music recorded on each disc or tape. ... Greater collaboration between libraries and the sound recording industry could result in more comprehensive catalogs that document recording sessions with greater specificity. With access to detailed and authoritative information about the universe of published sound recordings, libraries could devote more resources to surveying their unpublished holdings and collaborate on the construction of a preservation registry to help reduce preservation redundancy. ... Many archivists believe that adequate funding for preservation will not be forthcoming unless and until the recordings preserved can be heard more easily by the public. ... If audio recordings that do not have mass appeal are to be preserved, that responsibility will probably fall to libraries and archives. Within a partnership between archives and intellectual property owners, archives might assume responsibility for preserving less commercial music in return for the ability to share files of preserved historical recordings."
CA Databases have structure, and annotating them (with metadata for example) can be difficult. Work with semistructured data models in conjunction with the use of XML may solve this problem of accommodating unanticipated structures in databases.
Conclusions
RQ It's critical to develop tools to help data curators record and repeat the corrections they make.
Type
Web Page
Title
CDL Digital Object Standard: Metadata, Content and Encoding
This document addresses the standards for digital object collections for the California Digital Library. Adherence to these standards is required for all CDL contributors and may also serve University of California staff as guidelines for digital object creation and presentation. These standards are not intended to address all of the administrative, operational, and technical issues surrounding the creation of digital object collections.
Critical Arguements
CA These standards describe the file formats, storage and access standards for digital objects created by or incorporated into the CDL as part of the permanent collections. They attempt to balance adherence to industry standards, reproduction quality, access, potential longevity and cost.
Conclusions
RQ not applicable
SOW
DC "This is the first version of the CDL Digital Object Standard. This version is based upon the September 1, 1999 version of the CDL's Digital Image Standard, which included recommendations of the Museum Educational Site Licensing Project (MESL), the Library of Congress and the MOA II participants." ... "The Museum Educational Site Licensing Project (MESL) offered a framework for seven collecting institutions, primarily museums, and seven universities to experiment with new ways to distribute visual information--both images and related textual materials." ... "The Making of America (MoA II) Testbed Project is a Digital Library Federation (DLF) coordinated, multi-phase endeavor to investigate important issues in the creation of an integrated, but distributed, digital library of archival materials (i.e., digitized surrogates of primary source materials found in archives and special collections). The participants include Cornell University, New York Public Library, Pennsylvania State University, Stanford University and UC Berkeley. The Library of Congress white papers and standards are based on the experience gained during the American Memory Pilot Project. The concepts discussed and the principles developed still guide the Library's digital conversion efforts, although they are under revision to accommodate the capabilities of new technologies and new digital formats." ... 
"The CDL Technical Architecture and Standards Workgroup includes the following members with extensive experience with digital object collection and management: Howard Besser, MESL and MOA II digital imaging testbed projects; Diane Bisom, University of California, Irvine; Bernie Hurley, MOA II, University of California, Berkeley; Greg Janee, Alexandria Digital Library; John Kunze, University of California, San Francisco; Reagan Moore and Chaitanya Baru, San Diego Supercomputer Center, ongoing research with the National Archives and Records Administration on the long term storage and retrieval of digital content; Terry Ryan, University of California, Los Angeles; David Walker, California Digital Library"
Type
Web Page
Title
Update on the National Digital Infrastructure Initiative
CA Describes progress on a five-year national strategy for preserving digital content.
Conclusions
RQ "These sessions helped us set priorities. Participants agreed about the need for a national preservation strategy. People from industry were receptive to the idea that the public good, as well as their own interests, would be served by coming together to think about long-term preservation. They also agreed on the need for some form of distributor-decentralized solution. Like others, they realize that no library can tackle the digital preservation challenge alone. Many parties will need to come together. Participants agreed about the need for digital preservation research, a clearer agenda, a better focus, and a greater appreciation that technology is not necessarily the prime focus. The big challenge might be organizational architecture, i.e., roles and responsibilities. Who is going to do what? How will we reach agreement?"
There are many types of standards used to manage museum collections information. These "standards", which range from precise technical standards to general guidelines, enable museum data to be efficiently and consistently indexed, sorted, retrieved, and shared, both in automated and paper-based systems. Museums often use metadata standards (also called data structure standards) to help them define what types of information to record in their database (or card catalogue) and how to structure that information (the relationships between the different types of information). Following (or mapping data to) these standards makes it possible for museums to move their data between computer systems, or share their data with other organizations.
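The "mapping" described above can be sketched as a simple field crosswalk. In this hedged illustration (not CHIN code), invented local catalogue fields are re-keyed to shared Dublin Core element names so that records from different systems become comparable:

```python
# Hypothetical crosswalk from local museum database fields to shared
# Dublin Core element names. The local field names are invented for
# this example; "title", "creator" and "date" are Dublin Core elements.
CROSSWALK = {
    "ObjectName": "title",
    "Maker": "creator",
    "DateMade": "date",
}

def map_record(local_record: dict) -> dict:
    """Re-key a local catalogue record into the shared element set.

    Fields with no mapping (e.g. internal accession numbers) are
    simply dropped from the interchange record.
    """
    return {CROSSWALK[k]: v for k, v in local_record.items() if k in CROSSWALK}

shared = map_record({"ObjectName": "Teapot", "Maker": "Unknown", "Accession": "1987.12"})
print(shared)
```

Real crosswalks also handle value standards and one-to-many mappings, but the principle is the same: the shared element set, not the local schema, is what travels between organizations.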
Notes
The CHIN Web site features sections dedicated to Creating and Managing Digital Content, Intellectual Property, Collections Management, Standards, and more. CHIN's array of training tools, online publications, directories and databases are especially designed to meet the needs of both small and large institutions. The site also provides access to up-to-date information on topics such as heritage careers, funding and conferences.
Critical Arguements
CA "Museums often want to use their collections data for many purposes, (exhibition catalogues, Web access for the public, and curatorial research, etc.), and they may want to share their data with other museums, archives, and libraries in an automated way. This level of interoperability between systems requires cataloguing standards, value standards, metadata standards, and interchange standards to work together. Standards enable the interchange of data between cataloguer and searcher, between organizations, and between computer systems."
Conclusions
RQ "CHIN is also involved in a project to create metadata for a pan-Canadian inventory of learning resources available on Canadian museum Web sites. Working in consultation with the Consortium for the Interchange of Museum Information (CIMI), the Gateway to Educational Materials (GEM) [link to GEM in Section G], and SchoolNet, the project involves the creation of a Guide to Best Practices and cataloguing tool for generating metadata for online learning materials."
SOW
DC "CHIN is involved in the promotion, production, and analysis of standards for museum information. The CHIN Guide to Museum Documentation Standards includes information on: standards and guidelines of interest to museums; current projects involving standards research and implementation; organizations responsible for standards research and development; Links." ... "CHIN is a member of CIMI (the Consortium for the Interchange of Museum Information), which works to enable the electronic interchange of museum information. From 1998 to 1999, CHIN participated in a CIMI Metadata Testbed which aimed to explore the creation and use of metadata for facilitating the discovery of electronic museum information. Specifically, the project explored the creation and use of Dublin Core metadata in describing museum collections, and examined how Dublin Core could be used as a means to aid in resource discovery within an electronic, networked environment such as the World Wide Web." 
This is one of a series of guides produced by the Cedars digital preservation project. This guide concentrates on the technical approaches that Cedars recommends as a result of its experience. The accent is on preservation, without which continued access is not possible. The time scale is at least decades, i.e. way beyond the lifetime of any hardware technology. The overall preservation strategy is to remove the data from its medium of acquisition and to preserve the digital content as a stream of bytes. There is good reason to be confident that data held as a stream of bytes can be preserved indefinitely. Just as there is no access without preservation, preservation with no prospect of future access is a very sterile exercise. As well as preserving the data as a byte-stream, Cedars adds in metadata. This includes reference to facilities (called technical metadata in this document) for accessing the intellectual content of the preserved data. This technical metadata will usually include actual software for use in accessing the data. It will be stored as a preserved object in the overall archive store, and will be revised as technology evolves making new methods of access to preserved objects appropriate. There will be big economies of scale, as most, if not all, objects of the same type will share the same technical metadata. Cedars recommends against repeated format conversions, and instead argues for keeping the preserved byte-stream, while tracking evolving technology by maintaining the technical metadata. It is for this reason that Cedars includes only a reference to the technical metadata in the preserved data object. Thus future users of the object will be pointed to information appropriate to their own era, rather than that of the object's preservation. The monitoring and updating of this aspect of the technical metadata is a vital function of the digital library. 
In practice, Cedars expects that very many preserved digital objects will be in the same format, and will reference the same technical metadata. Access to a preserved object then involves Migration on Request, in that any necessary migration from an obsolete format to an appropriate current day format happens at the point of request. As well as recommending actions to be taken to preserve digital objects, Cedars also recommends the use of a permanent naming scheme, with a strong recommendation that such a scheme should be infinitely extensible.
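The Migration on Request approach described above can be sketched in a few lines. This is a minimal illustration (not Cedars code, and the registry and format names are invented): preserved objects are kept as unchanged byte-streams holding only a *reference* to shared technical metadata, and conversion to a current-day format happens at the point of access. Updating the registry entry tracks evolving technology without ever rewriting the preserved objects.

```python
# Hypothetical technical-metadata registry, keyed by format identifier.
# One entry serves every preserved object of that format, giving the
# economies of scale the Cedars approach anticipates. Revising an
# entry's "migrate" function is how the library tracks new technology.
TECH_METADATA = {
    "text/latin1": {
        "description": "8-bit Latin-1 encoded text",
        # Migration appropriate to the current era: deliver as UTF-8.
        "migrate": lambda data: data.decode("latin-1").encode("utf-8"),
    },
}

class PreservedObject:
    """A preserved byte-stream plus a reference (not a copy) to technical metadata."""
    def __init__(self, byte_stream: bytes, format_id: str):
        self.byte_stream = byte_stream   # kept unchanged, indefinitely
        self.format_id = format_id       # reference into TECH_METADATA

def access(obj: PreservedObject) -> bytes:
    """Migration on Request: migrate only when the object is asked for."""
    migrate = TECH_METADATA[obj.format_id]["migrate"]
    return migrate(obj.byte_stream)

obj = PreservedObject("café".encode("latin-1"), "text/latin1")
current_form = access(obj)  # intellectual content in a present-day encoding
```

Note that `obj.byte_stream` is never altered: repeated format conversions, and the cumulative losses they risk, are avoided by design.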
Critical Arguements
CA "This document is intended to inform technical practitioners in the actual preservation of digital materials, and also to highlight to library management the importance of this work as continuing their traditional scholarship role into the 21st century."
This document provides some background on preservation metadata for those interested in digital preservation. It first attempts to explain why preservation metadata is seen as an essential part of most digital preservation strategies. It then gives a broad overview of the functional and information models defined in the Reference Model for an Open Archival Information System (OAIS) and describes the main elements of the Cedars outline preservation metadata specification. The next sections take a brief look at related metadata initiatives, make some recommendations for future work and comment on cost issues. At the end there are some brief recommendations for collecting institutions and the creators of digital content followed by some suggestions for further reading.
Critical Arguements
CA "This document is intended to provide a brief introduction to current preservation metadata developments and introduce the outline metadata specifications produced by the Cedars project. It is aimed in particular at those who may have responsibility for digital preservation in the UK further and higher education community, e.g. senior staff in research libraries and computing services. It should also be useful for those undertaking digital content creation (digitisation) initiatives, although it should be noted that specific guidance on this is available elsewhere. The guide may also be of interest to other kinds of organisations that have an interest in the long-term management of digital resources, e.g. publishers, archivists and records managers, broadcasters, etc. This document aims to provide: A rationale for the creation and maintenance of preservation metadata to support digital preservation strategies, e.g. migration or emulation; An introduction to the concepts and terminology used in the influential ISO Reference Model for an Open Archival Information System (OAIS); Brief information on the Cedars outline preservation metadata specification and the outcomes of some related metadata initiatives; Some notes on the cost implications of preservation metadata and how these might be reduced."
Conclusions
RQ "In June 2000, a group of archivists, computer scientists and metadata experts met in the Netherlands to discuss metadata developments related to recordkeeping and the long-term preservation of archives. One of the key conclusions made at this working meeting was that the recordkeeping metadata communities should attempt to co-operate more with other metadata initiatives. The meeting also suggested research into the contexts of creation and use, e.g. identifying factors that might encourage or discourage creators from meeting recordkeeping metadata requirements. This kind of research would also be useful for wider preservation metadata developments. One outcome of this meeting was the setting up of an Archiving Metadata Forum (AMF) to form the focus of future developments." ... "Future work on preservation metadata will need to focus on several key issues. Firstly, there is an urgent need for more practical experience of undertaking digital preservation strategies. Until now, many preservation metadata initiatives have largely been based on theoretical considerations or high-level models like the OAIS. This is not in itself a bad thing, but it is now time to begin to build metadata into the design of working systems that can test the viability of digital preservation strategies in a variety of contexts. This process has already begun in initiatives like the Victorian Electronic Records Strategy and the San Diego Supercomputer Center's 'self-validating knowledge-based archives'. A second need is for increased co-operation between the many metadata initiatives that have an interest in digital preservation. This may include the comparison and harmonisation of various metadata specifications, where this is possible. The OCLC/RLG working group is an example of how this has been taken forward within a particular domain. 
There is a need for additional co-operation with recordkeeping metadata specialists, computing scientists and others in the metadata research community. Thirdly, there is a need for more detailed research into how metadata will interact with different formats, preservation strategies and communities of users. This may include some analysis of what metadata could be automatically extracted as part of the ingest process, an investigation of the role of content creators in metadata provision, and the production of user requirements." ... "Also, thought should be given to the development of metadata standards that will permit the easy exchange of preservation metadata (and information packages) between repositories." ... "As well as ensuring that digital repositories are able to facilitate the automatic capture of metadata, some thought should also be given to how best digital repositories could deal with any metadata that might already exist."
SOW
DC "Funded by JISC (the Joint Information Systems Committee of the UK higher education funding councils), as part of its Electronic Libraries (eLib) Programme, Cedars was the only project in the programme to focus on digital preservation." ... "In the digitial library domain, the development of a recommendation on preservation metadata is being co-ordinated by a working group supported by OCLC and the RLG. The membership of the working group is international, and inlcudes key individuals who were involved in the development of the Cedars, NEDLIB and NLA metadata specifications."
Type
Web Page
Title
Practical Tools for Electronic Records Management and Preservation
"This briefing paper summarizes the results of a cooperative project sponsored in part, by a research grant from the National Historical Publications and Records Commission. The project, called "Models for Action: Practical Approaches to Electronic Records Management and Preservation," focused on the development of practical tools to support the integration of essential electronic records management requirements into the design of new information systems. The project was conducted from 1996 to 1998 through a partnership between the New York State Archives and Records Administration and the Center for Technology in Government. The project team also included staff from the NYS Adirondack Park Agency, eight corporate partners led by Intergraph Corporation, and University at Albany faculty and graduate students."
Publisher
Center for Technology in Government
Critical Arguements
CA "This briefing paper bridges the gap between theory and practice by presenting generalizable tools that link records management practices to business objectives."
Type
Web Page
Title
Deliberation No. 11/2004 of 19 February 2004: "Technical Rules for Copying and Preserving Electronic Documents on Digital Media which are Suitable to Guarantee Authentic Copies"
CA Recognizes that preservation of authentic electronic records means preservation of authentic/true copies. Thus the preservation process is called substitute preservation, and the authenticity of a preserved document is not established on the object itself (as it was with traditional media), but through the authority of the preserver (and possibly a notary), who would attest to the identity and integrity of the whole of the reproduced documents every time a migration occurs. The preserver's task list is also noteworthy. Archival units description stands out as an essential activity (not replaceable by the metadata which are associated to each single document) in order to maintain intellectual control over holdings.
SOW
DC CNIPA (Centro Nazionale per l'Informatica nella Pubblica Amministrazione) replaced AIPA (Autorita' per l'Informatica nella Pubblica Amministrazione) in 2003. Such an Authority (established in 1993 according to art. 4 of the Legislative Decree 39/1993, as amended by art. 176 of the Legislative Decree 196/2003) operates as a branch of the Council of Ministers' Presidency with the mandate to put the Ministry for Innovation and Technologies' policies into practice. In particular, CNIPA is responsible for bringing about reforms relevant to PA's modernization, the spread of e-government and the development of nationwide networks to foster better communication among public offices and between citizens and the State. In the Italian juridical system, CNIPA's deliberations have a lower enabling power, but they nevertheless are part of the State's body of laws. The technical rules provided in CNIPA's deliberation 11/2004 derive from art. 6, par. 2 of the DPR 445/2000, which says: "Preservation obligations are fully satisfied, both for administrative and probative purposes, also with the use of digital media when the employed procedures comply with the technical rules provided by AIPA." In order to keep those rules up to date according to the latest technology, AIPA's deliberation no. 42 of 13 December 2001 on "Technical rules for documents reproduction and preservation on digital media that are suitable to guarantee true copies of the original documents" has been replaced by the current CNIPA deliberation.
The CDISC Submission Metadata Model was created to help ensure that the supporting metadata for these submission datasets meets the following objectives: Provide FDA reviewers with clear descriptions of the usage, structure, contents, and attributes of all datasets and variables; Allow reviewers to replicate most analyses, tables, graphs, and listings with minimal or no transformations; Enable reviewers to easily view and subset the data used to generate any analysis, table, graph, or listing without complex programming. ... The CDISC Submission Metadata Model has been defined to guide sponsors in the preparation of data that is to be submitted to the FDA. By following the principles of this model, sponsors will help reviewers to accurately interpret the contents of submitted data and work with it more effectively, without sacrificing the scientific objectives of clinical development.
Publisher
The Clinical Data Interchange Standards Consortium
Critical Arguements
CA "The CDISC Submission Data Model has focused on the use of effective metadata as the most practical way of establishing meaningful standards applicable to electronic data submitted for FDA review."
Conclusions
RQ "Metadata prepared for a domain (such as an efficacy domain) which has not been described in a CDISC model should follow the general format of the safety domains, including the same set of core selection variables and all of the metadata attributes specified for the safety domains. Additional examples and usage guidelines are available on the CDISC web site at www.cdisc.org." ... "The CDISC Metadata Model describes the structure and form of data, not the content. However, the varying nature of clinical data in general will require the sponsor to make some decisions about how to represent certain real-world conditions in the dataset. Therefore, it is useful for a metadata document to give the reviewer an indication of how the datasets handle certain special cases."
SOW
DC CDISC is an open, multidisciplinary, non-profit organization committed to the development of worldwide standards to support the electronic acquisition, exchange, submission and archiving of clinical trials data and metadata for medical and biopharmaceutical product development. CDISC members work together to establish universally accepted data standards in the pharmaceutical, biotechnology and device industries, as well as in regulatory agencies worldwide. CDISC currently has more than 90 members, including the majority of the major global pharmaceutical companies.
Type
Web Page
Title
CDISC Achieves Two Significant Milestones in the Development of Models for Data Interchange
CA "The Clinical Data Interchange Standards Consortium has achieved two significant milestones towards its goal of standard data models to streamline drug development and regulatory review processes. CDISC participants have completed metadata models for the 12 safety domains listed in the FDA Guidance regarding Electronic Submissions and have produced a revised XML-based data model to support data acquisition and archive."
Conclusions
RQ "The goal of the CDISC XML Document Type Definition (DTD) Version 1.0 is to make available a first release of the definition of this CDISC model, in order to support sponsors, vendors and CROs in the design of systems and processes around a standard interchange format."
SOW
DC "This team, under the leadership of Wayne Kubick of Lincoln Technologies, and Dave Christiansen of Genentech, presented their metadata models to a group of representatives at the FDA on Oct. 10, and discussed future cooperative efforts with Agency reviewers."... "CDISC is a non-profit organization with a mission to lead the development of standard, vendor-neutral, platform-independent data models that improve process efficiency while supporting the scientific nature of clinical research in the biopharmaceutical and healthcare industries"
This document outlines the best practices guidelines for creation of EAD-encoded finding aids for submission to the Archives Hub in the U.K. It includes sections on Mandatory Fields, Access Points, Manual Encoding, Multilevel Descriptions, Saving and Submitting Files, and Links.
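The kind of collection-level description the guidelines call for can be illustrated with a skeletal EAD finding aid. This is a hedged sketch only: element content is placeholder, and the Hub's own mandatory-field list, not this fragment, governs what must actually be supplied.

```xml
<!-- Skeletal EAD 2002 collection-level description (illustrative only). -->
<ead>
  <eadheader>
    <eadid countrycode="gb">gb-0000-example</eadid>
    <filedesc>
      <titlestmt>
        <titleproper>Papers of an Example Creator</titleproper>
      </titlestmt>
    </filedesc>
  </eadheader>
  <archdesc level="collection">
    <did>
      <unitid>GB 0000 EX</unitid>
      <unittitle>Papers of an Example Creator</unittitle>
      <unitdate normal="1900/1950">1900-1950</unitdate>
      <origination>Example Creator</origination>
      <physdesc><extent>3 boxes</extent></physdesc>
    </did>
    <controlaccess>
      <subject>Example subject access point</subject>
    </controlaccess>
  </archdesc>
</ead>
```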
Notes
This is a downloadable .pdf file. Also available in Rich Text Format (.rtf).
Publisher
Archives Hub, U.K.
Publication Location
Manchester, England, U.K.
Language
English
Critical Arguements
CA "These pages have been designed to hold links and information which we hope will be useful to archivists and librarians working in the UK Higher and Further Education sectors."
Conclusions
RQ
SOW
DC The Archives Hub provides a single point of access to 17,598 descriptions of archives held in UK universities and colleges. At present these are primarily at collection-level, although complete catalogue descriptions are provided where they are available. The Archives Hub forms one part of the UK's National Archives Network, alongside related networking projects. A Steering Committee which includes representatives of contributing institutions, the National Archives and the other archive networks guides the progress of the project. There is also a Contributors' and Users' Forum which provides feedback to aid the development of the service. The service is hosted at MIMAS on behalf of the Consortium of University Research Libraries (CURL) and is funded by the Joint Information Systems Committee (JISC). Systems development work is undertaken at the University of Liverpool.
Type
Web Page
Title
eXtensible rights Markup Language (XrML) 2.0 Specification Part I: Primer
This specification defines the eXtensible rights Markup Language (XrML), a general-purpose language in XML used to describe the rights and conditions for using digital resources.
Publisher
ContentGuard
Critical Arguements
CA This chapter provides an overview of XrML. It provides a basic definition of XrML, describes the need that XrML is meant to address, and explains design goals for the language.
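The shape of the language can be suggested with a schematic rights expression: a license granting a principal a right over a resource, subject to a condition. This fragment is illustrative only; element names are simplified, namespaces are omitted, and it should not be read as schema-valid XrML 2.0.

```xml
<!-- Schematic only: a license in which a grant ties together a
     principal (keyHolder), a right (play), a resource
     (digitalResource) and a condition (validityInterval). -->
<license>
  <grant>
    <keyHolder>
      <!-- identification of the principal to whom the right is granted -->
    </keyHolder>
    <play/>
    <digitalResource>
      <!-- reference to the content the right applies to -->
    </digitalResource>
    <validityInterval>
      <notAfter>2026-01-01T00:00:00</notAfter>
    </validityInterval>
  </grant>
  <issuer>
    <!-- the party issuing, and typically signing, the license -->
  </issuer>
</license>
```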
Conclusions
RQ not applicable
SOW
DC ContentGuard contributed XrML to MPEG-21, the OASIS Rights Language Technical Committee and the Open eBook Forum (OeBF). In each case they are using XrML as the base for their rights language specification. Furthest along is MPEG, where the process has reached Committee Draft. They have also recommended to other standards bodies to build on this work. ContentGuard will propose XrML to any standards organization seeking a rights language. Because of this progress ContentGuard has frozen its release of XrML at Version 2.0.
CA ContentGuard intends to submit XrML to standards bodies that are developing specifications that enable the exchange and trading of content as well as the creation of repositories for storage and management of digital content.
Type
Web Page
Title
Model Requirements For The Management Of Electronic Records
CA "The MoReq specification is ... primarily intended to serve as a practical tool in helping organisations meet their business needs for the management of both computer-based and paper-based records. While its development has taken traditional archival science and records management disciplines into account, these have been interpreted in a manner appropriate to electronic environments. ... Examples of this pragmatic approach include the incorporation of requirements for document management, workflow, metadata and other related technologies. ... This specification attempts to cover a wide range of requirements -- for different countries, in different industries and with different types of records. The wide scope is intentional, but it leads to a significant limitation, namely that this single specification cannot represent a requirement which precisely maps onto existing requirements without modification."
Type
Web Page
Title
PBCore: Public Broadcasting Metadata Dictionary Project
CA "PBCore is designed to provide -- for television, radio and Web activities -- a standard way of describing and using media (video, audio, text, images, rich interactive learning objects). It allows content to be more easily retrieved and shared among colleagues, software systems, institutions, community and production partners, private citizens, and educators. It can also be used as a guide for the onset of an archival or asset management process at an individual station or institution. ... The Public Broadcasting Metadata Dictionary (PBCore) is: a core set of terms and descriptors (elements) used to create information (metadata) that categorizes or describes media items (sometimes called assets or resources)."
Conclusions
RQ The PBCore Metadata Elements are currently in their first published edition, Version 1.0. Over two years of research and lively discussions have generated this version. ... As various users and communities begin to implement the PBCore, updates and refinements to the PBCore are likely to occur. Any changes will be clearly identified, ramifications outlined, and published to our constituents.
SOW
DC "Initial development funding for PBCore was provided by the Corporation for Public Broadcasting. The PBCore is built on the foundation of the Dublin Core (ISO 15836) ... and has been reviewed by the Dublin Core Metadata Initiative Usage Board. ... PBCore was successfully deployed in a number of test implementations in May 2004 in coordination with WGBH, Minnesota Public Radio, PBS, National Public Radio, Kentucky Educational Television, and recognized metadata expert Grace Agnew. As of July 2004 in response to consistent feedback to make metadata standards easy to use, the number of metadata elements was reduced to 48 from the original set of 58 developed by the Metadata Dictionary Team. Also, efforts are ongoing to provide more focused metadata examples that are specific to TV and radio. ... Available free of charge to public broadcasting stations, distributors, vendors, and partners, version 1.0 of PBCore was launched in the first quarter of 2005. See our Licensing Agreement via the Creative Commons for further information. ... Plans are under way to designate an Authority/Maintenance Organization."
Type
Web Page
Title
Schema Registry: activityreports: Recordkeeping Metadata Standard for Commonwealth Agencies
CA "The Australian SPIRT Recordkeeping Metadata Project was initially a project funded under a programme known as the Strategic Partnership with Industry -- Research and Training (SPIRT) Support Grant -- partly funded by the Australian Research Council. The project was concerned with developing a framework for standardising and defining recordkeeping metadata and produced a metadata element set eventually known as the Australian Recordkeeping Metadata Schema (RKMS). The conceptual frame of reference in the project was based in Australian archival practice, including the Records Continuum Model and the Australian Series System. The RKMS also inherits part of the Australian Government Locator Service (AGLS) metadata set."
The creation and use of metadata is likely to become an important part of all digital preservation strategies, whether they are based on hardware and software conservation, emulation or migration. The UK Cedars project aims to promote awareness of the importance of digital preservation, to produce strategic frameworks for digital collection management policies and to promote methods appropriate for long-term preservation, including the creation of appropriate metadata. Preservation metadata is a specialised form of administrative metadata that can be used as a means of storing the technical information that supports the preservation of digital objects. In addition, it can be used to record migration and emulation strategies, to help ensure authenticity, and to note rights management and collection management data; it will also need to interact with resource discovery metadata. The Cedars project is attempting to investigate some of these issues and will provide some demonstrator systems to test them.
Notes
This article was presented at the Joint RLG and NPO Preservation Conference: Guidelines for Digital Imaging, held September 28-30, 1998.
Critical Arguements
CA "Cedars is a project that aims to address strategic, methodological and practical issues relating to digital preservation (Day 1998a). A key outcome of the project will be to improve awareness of digital preservation issues, especially within the UK higher education sector. Attempts will be made to identify and disseminate: Strategies for collection management ; Strategies for long-term preservation. These strategies will need to be appropriate to a variety of resources in library collections. The project will also include the development of demonstrators to test the technical and organisational feasibility of the chosen preservation strategies. One strand of this work relates to the identification of preservation metadata and a metadata implementation that can be tested in the demonstrators." ... "The Cedars Access Issues Working Group has produced a preliminary study of preservation metadata and the issues that surround it (Day 1998b). This study describes some digital preservation initiatives and models with relation to the Cedars project and will be used as a basis for the development of a preservation metadata implementation in the project. The remainder of this paper will describe some of the metadata approaches found in these initiatives."
Conclusions
RQ "The Cedars project is interested in helping to develop suitable collection management policies for research libraries." ... "The definition and implementation of preservation metadata systems is going to be an important part of the work of custodial organisations in the digital environment."
SOW
DC "The Cedars (CURL exemplars in digital archives) project is funded by the Joint Information Systems Committee (JISC) of the UK higher education funding councils under Phase III of its Electronic Libraries (eLib) Programme. The project is administered through the Consortium of University Research Libraries (CURL) with lead sites based at the Universities of Cambridge, Leeds and Oxford."
Type
Web Page
Title
Metadata for preservation : CEDARS project document AIW01
This report is a review of metadata formats and initiatives in the specific area of digital preservation. It supplements the DESIRE Review of metadata (Dempsey et al. 1997). It is based on a literature review and information picked up at a number of workshops and meetings, and is an attempt to briefly describe the state of the art in the area of metadata for digital preservation.
Critical Arguements
CA "The projects, initiatives and formats reviewed in this report show that much work remains to be done. . . . The adoption of persistent and unique identifiers is vital, both in the CEDARS project and outside. Many of these initiatives mention "wrappers", "containers" and "frameworks". Some thought should be given to how metadata should be integrated with data content in CEDARS. Authenticity (or intellectual preservation) is going to be important. It will be interesting to investigate whether some archivists' concerns with custody or "distributed custody" will have relevance to CEDARS."
Conclusions
RQ Which standards and initiatives described in this document have proved viable preservation metadata models?
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
Type
Web Page
Title
Approaches towards the Long Term Preservation of Archival Digital Records
The Digital Preservation Testbed is carrying out experiments according to pre-defined research questions to establish the best preservation approach or combination of approaches. The Testbed will be focusing its attention on three different digital preservation approaches - Migration; Emulation; and XML - evaluating the effectiveness of these approaches, their limitations, costs, risks, uses, and resource requirements.
Language
English; Dutch
Critical Arguements
CA "The main problem surrounding the preservation of authentic electronic records is that of technology obsolescence. As changes in technology continue to increase exponentially, the problem arises of what to do with records that were created using old and now obsolete hardware and software. Unless action is taken now, there is no guarantee that the current computing environment (and thus also records) will be accessible and readable by future computing environments."
Conclusions
RQ "The Testbed will be conducting research to discover if there is an inviolable way to associate metadata with records and to assess the limitations such an approach may incur. We are also working on the provision of a proposed set of preservation metadata that will contain information about the preservation approach taken and any specific authenticity requirements."
SOW
DC The Digital Preservation Testbed is part of the non-profit organisation ICTU. ICTU is the Dutch organisation for ICT and government. ICTU's goal is to contribute to the structural development of e-government. This will result in improving the work processes of government organisations, their service to the community and interaction with the citizens. Government institutions, such as Ministries, design the policies in the area of e-government, and ICTU translates these policies into projects. In many cases, more than one institution is involved in a single project. They are the principals in the projects and retain control concerning the focus of the project. In the case of the Digital Preservation Testbed, the principals are the Ministry of the Interior and the Dutch National Archives.
This paper discusses how metadata standards can help organizations comply with the ISO 9000 standards for quality systems. It provides a brief overview of metadata, ISO 9000 and related records management standards. It then analyses in some depth the ISO 9000 requirements for quality records, and outlines the problems that some organizations have in complying with them. It also describes the metadata specifications developed by the University of Pittsburgh Electronic Recordkeeping project and the SPIRT Recordkeeping Metadata project in Australia and discusses the role of metadata in meeting ISO 9000 requirements for the creation and preservation of reliable, authentic and accessible records.
Publisher
Records Continuum Research Group
Critical Arguements
CA "During the last few years a number of research projects have studied the types of metadata needed to create, manage and make accessible quality records, i.e. reliable, authentic and useable records. This paper will briefly discuss the purposes of recordkeeping metadata, with reference to emerging records management standards, and the models presented by two projects, one in the United States and one in Australia. It will also briefly review the ISO 9000 requirements for records and illustrate how metadata can help an organization meet these requirements."
Conclusions
RQ "Quality records provide many advantages for organizations and can help companies meet the ISO 9000 certification. However, systems must be designed to create the appropriate metadata to ensure they comply with recordkeeping requirements, particularly those identified by records management standards like AS 4390 and the proposed international standard, which provide benchmarks for recordkeeping best practice. The Pittsburgh metadata model and the SPIRT framework provide organizations with standardized sets of metadata that would ensure the creation, preservation and accessibility of reliable, authentic and meaningful records for as long as they are of use. In deciding what metadata to capture, organisations should consider the cost of meeting the requirements of the ISO 9000 guidelines and any related records management best practice standards, and the possible risk of not meeting these requirements."
Type
Web Page
Title
The Gateway to Educational Materials: An Evaluation Study, Year 4: A Technical Report submitted to the US Department of Education
CA The Gateway to Educational Materials (GEM) is a Web site created through the efforts of several groups, including the US Department of Education, The National Library of Education, and a team from Syracuse University. The goal of the project is to provide teachers with a broad range of educational materials on the World Wide Web. This study evaluates The Gateway as an online source of educational information. The purpose of this evaluation is to provide developers of The Gateway with information about aspects of the system that might need improvement, and to display lessons learned through this process to developers of similar systems. It is the fourth in a series of annual studies, and focuses on effectiveness of The Gateway from the perspectives of end users and collection holders.
CA In March 2003, the intention of undertaking an international survey of LOM implementations was announced at the plenary meeting of the "Information Technology for Learning, Education and Training", ISO/IEC JTC1/SC36 sub-committee. The ISO/IEC JTC1/SC36 committee is international in both membership and emphasis, and has a working group, Working Group (WG) 4, "Management and Delivery for Learning, Education, and Training," which has been explicitly charged with the task of contributing to future standardization work on the LOM. <warrant> The international LOM Survey focuses on two questions: 1) "Which elements were selected for use or population?"; and 2) "How were these elements used, or what were the types of values assigned to them?" This report also attempts to draw a number of tentative suggestions and conclusions for further standardization work.
Conclusions
RQ Based on its findings, the preliminary survey report was able to suggest a number of conclusions: First, fewer and better-defined elements may be more effective than the range of choice and interpretive possibilities currently allowed by the LOM. This seems to be especially the case regarding educational elements, which are surprisingly underutilized for metadata that is ostensibly and primarily educational. Second, clear and easily-supported means of working with local, customized vocabularies would also be very valuable. Third, it also seems useful to ensure that structures are provided to accommodate complex but more conventional aspects of resource description. These would include multiple title versions, as well as multilingual descriptions and values.
SOW
DC On June 12, 2002, 1484.12.1 - 2002 Learning Object Metadata (LOM) was approved by the IEEE-Standards Association.
Type
Web Page
Title
Towards a Digital Rights Expression Language Standard for Learning Technology
CA The Learning Technology Standards Committee (LTSC) of the Institute for Electrical and Electronic Engineers (IEEE) concentrated on making recommendations for standardizing a digital rights expression language (DREL) with the specific charge to (1) Investigate existing standards development efforts for DREL and digital rights. (2) Gather DREL requirements germane to the learning, education, and training industries. (3) Make recommendations as to how to proceed. (4) Feed requirements into ongoing DREL and digital rights standardization efforts, regardless of whether the LTSC decides to work with these efforts or embark on its own. This report represents the achievement of these goals in the form of a white paper that can be used as reference for the LTSC, that reports on the current state of existing and proposed standardization efforts targeting digital rights expression languages and makes recommendations concerning future work.
Conclusions
RQ The recommendations of this report are: 1. Maintain appropriate liaisons between learning technology standards development organizations and those standards development organizations standardizing rights expression languages. The purpose of these liaisons is to continue to feed requirements into broader standardization efforts and to ensure that the voice of the learning, education and training community is heard. 2. Support the creation of application profiles or extensions of XrML and ODRL that include categories and vocabularies for roles common in educational and training settings. In the case of XrML, a name space for local context may be needed. (A name space is required for both XrML and ODRL for the "application profile" or, specifically, the application (LT application) extension.) 3. Advocate the creation of a standard for expressing local policies in ways that can be mapped to rights expressions. This could be either through a data model or through the definition of an API or service. 4. Launch an initiative to identify models of rights enforcement in learning technology and to possibly abstract a common model for use by architecture and framework definition projects. 5. Further study the implications of patent claims, especially for educational and research purposes.
Type
Web Page
Title
METS : Metadata Encoding and Transmission Standard
CA "METS, although in its early stages, is already sufficiently established amongst key digital library players that it can reasonably be considered the only viable standard for digital library objects in the foreseeable future. Although METS may be an excellent framework, it is just that and only that. It does not prescribe the content of the metadata itself, and this is a continuing problem for METS and all other schema to contend with if they are to realize their full functionality and usefulness."
Conclusions
RQ The standardization (via some sort of cataloging rules) of the content held by metadata "containers" urgently needs to be addressed. If not, the full value of any metadata scheme, no matter how extensible or robust, will not be realized.
Type
Web Page
Title
Appendix N to Proceedings of The Uniform Law Conference of Canada, Proposals for a Uniform Electronic Evidence Act
CA "First, there is a great deal of uncertainty about how the [Canada Evidence Act], particularly s. 30(6), will be applied, and this makes it difficult for the parties to prepare for litigation and for businesses to know how they should keep their records. Second, there are risks to the integrity of records kept on a computer that do not exist with respect to other forms of information processing and storage, and if alterations are made, either negligently or deliberately, they can be extremely difficult to detect. Third, s. 30(1) provides little assurance that the record produced to the court is the same as the one that was originally made in the usual and ordinary course of business, for while self-interest may be an adequate guarantee that most businesses will maintain accurate and truthful records, it is not true for many others. The second and third problems combined place the party opposing the introduction of computer-produced business records in a difficult situation."
SOW
DC The Uniform Law Conference of Canada undertook to adopt uniform legislation to ensure that computer records could be used appropriately in court.
Type
Web Page
Title
National States Geographic Information Council (NSGIC) Metadata Primer -- A "How To" Guide on Metadata Implementation
The primer begins with a discussion of what metadata is and why metadata is important. This is followed by an overview of the Content Standards for Digital Geospatial Metadata (CSDGM) adopted by the Federal Geographic Data Committee (FGDC). Next, the primer focuses on the steps required to begin collecting and using metadata. The fourth section deals with how to select the proper metadata creation tool from the growing number being developed. Section five discusses the mechanics of documenting a data set, including strategies on reviewing the output to make sure it is in a useable form. The primer concludes with a discussion of other assorted metadata issues.
Critical Arguements
CA The Metadata Primer is one phase of a larger metadata research and education project undertaken by the National States Geographic Information Council and funded by the Federal Geographic Data Committee's Competitive Cooperative Agreements Program (CCAP). The primer is designed to provide a practical overview of the issues associated with developing and maintaining metadata for digital spatial data. It is targeted toward an audience of state, local, and tribal government personnel. The document provides a "cook book" approach to the creation of metadata. Because much of the most current information on metadata resides on the Internet, the primer summarizes relevant material available from other World Wide Web (WWW) home pages.
Conclusions
RQ To what extent could the NSGIC recommendations be used for non-geographic applications?
SOW
DC FGDC approved the Content Standard for Digital Geospatial Metadata (FGDC-STD-001-1998) in June 1998. FGDC is a 19-member interagency committee composed of representatives from the Executive Office of the President, Cabinet-level and independent agencies. The FGDC is developing the National Spatial Data Infrastructure (NSDI) in cooperation with organizations from State, local and tribal governments, the academic community, and the private sector. The NSDI encompasses policies, standards, and procedures for organizations to cooperatively produce and share geographic data.
Abstract The ability of investigators to share data is essential to the progress of integrative scientific research both within and across disciplines. This paper describes the main issues in achieving effective data sharing based on previous efforts in building scientific data networks and, particularly, recent efforts within the Earth sciences. This is presented in the context of a range of information architectures for effecting differing levels of standardization and centralization both from a technology perspective as well as a publishing protocol perspective. We propose a new Metadata Interchange Format (.mif) that can be used for more effective sharing of data and metadata across digital libraries, data archives and research projects.
Critical Arguements
CA "In this paper, we discuss two important information technology aspects of the electronic publication of data in the Earth sciences, metadata, and a variety of different concepts of electronic data publication. Metadata are the foundation of electronic data publications and they are determined by needs of archiving, the scientific analysis and reproducibility of a data set, and the interoperability of diverse data publication methods. We use metadata examples drawn from the companion paper by Staudigel et al. (this issue) to illustrate the issues involved in scaling-up the publication of data and metadata by individual scientists, disciplinary groups, the Earth science community-at-large and to libraries in general. We begin by reviewing current practices and considering a generalized alternative." ... "For this reason, we will first discuss different methods of data publishing via a scientific data network followed by an inventory of desirable characteristics of such a network. Then, we will introduce a method for generating a highly portable metadata interchange format we call .mif (pronounced dot-mif) and conclude with a discussion of how this metadata format can be scaled to support the diversity of interests within the Earth science community and other scientific communities." ... "We can borrow from the library community the methods by which to search for the existence and location of data (e.g., Dublin Core http://www.dublincore.org) but we must invent new ways to document the metadata needed within the Earth sciences and to comply with other metadata standards such as the Federal Geographic Data Committee (FGDC). To accomplish this, we propose a metadata interchange format that we call .mif that enables interoperability and an open architecture that is maximally independent of computer systems, data management approaches, proprietary software and file formats, while encouraging local autonomy and community cooperation."
Conclusions
RQ "These scalable techniques are being used in the development of a project we call SIOExplorer that can be found at http://sioexplorer.ucsd.edu although we have not discussed that project in any detail. The most recent contributions to this discussion and .mif applications and examples may be found at http://Earthref.org/metadata/GERM/."
SOW
DC This article was written by representatives of the San Diego Supercomputer Center and the Institute of Geophysics and Planetary Physics under the auspices of the University of California, San Diego.
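The kind of cross-community mapping the authors describe, resolving locally named Earth-science fields against Dublin Core terms for discovery while preserving discipline-specific metadata, can be sketched as follows. This is an illustration only: the local field names and the mapping table are hypothetical and are not drawn from the .mif specification; only the `dc:` terms come from Dublin Core.

```python
# Hypothetical local schemas from two collections. Only the dc: terms are
# real Dublin Core elements; the local field names are invented for the sketch.
DC_MAP = {
    "collection_a": {"sample_name": "dc:title", "chief_scientist": "dc:creator",
                     "cruise_date": "dc:date"},
    "collection_b": {"titel": "dc:title", "autor": "dc:creator"},
}

def to_dublin_core(collection: str, record: dict) -> dict:
    """Translate a local metadata record into Dublin Core keyed fields,
    keeping unmapped discipline-specific fields under an 'x-local:' prefix
    so no information is lost in the interchange."""
    mapping = DC_MAP[collection]
    return {mapping.get(field, f"x-local:{field}"): value
            for field, value in record.items()}

rec = to_dublin_core("collection_a",
                     {"sample_name": "Basalt D12-3", "water_depth_m": 2450})
# rec carries "dc:title" for discovery, while the unmapped water-depth
# field survives as "x-local:water_depth_m" for Earth-science users.
```

The design point, in the spirit of the paper, is that library-style discovery (Dublin Core) and discipline-specific description can coexist in one record rather than forcing migration to a single schema.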
Type
Web Page
Title
Softening the borderlines of archives through XML - a case study
Archives have always had troubles getting metadata in formats they can process. With XML, these problems are lessening. Many applications today provide the option of exporting data into an application-defined XML format that can easily be post-processed using XSLT, schema mappers, etc., to fit the archives' needs. This paper highlights two practical examples for the use of XML in the Swiss Federal Archives and discusses advantages and disadvantages of XML in these examples. The first use of XML is the import of existing metadata describing debates at the Swiss parliament whereas the second concerns preservation of metadata in the archiving of relational databases. We have found that the use of XML for metadata encoding is beneficial for the archives, especially for its ease of editing, built-in validation and ease of transformation.
Notes
The Swiss Federal Archives defines the norms and basis of records management and advises departments of the Federal Administration on their implementation. http://www.bar.admin.ch/bar/engine/ShowPage?pageName=ueberlieferung_aktenfuehrung.jsp
Critical Arguements
CA "This paper briefly discusses possible uses of XML in an archival context and the policies of the Swiss Federal Archives concerning this use (Section 2), provides a rough overview of the applications we have that use XML (Section 3) and the experiences we made (Section 4)."
Conclusions
RQ "The systems described above are now just being deployed into real world use, so the experiences presented here are drawn from the development process and preliminary testing. No hard facts in testing the sustainability of XML could be gathered, as the test is time itself. This test will be passed when we can still access the data stored today, including all metadata, in ten or twenty years." ... "The main problem area with our applications was the encoding of the XML documents and the non-standard XML document generation of some applications. When dealing with the different encodings (UTF-8, UTF-16, ISO-8859-1, etc) some applications purported a different encoding in the header of the XML document than the true encoding of the document. These errors were quickly identified, as no application was able to read the documents."
SOW
DC The author is currently a private digital archives consultant, but at the time of this article, was a data architect for the Swiss Federal Archives. The content of this article owes much to the work being done by a team of architects and engineers at the Archives, who are working on an e-government project called ARELDA (Archiving of Electronic Data and Records).
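The encoding problem the Testbed reports, a document whose XML prolog declares one encoding while its bytes are actually in another, can be detected mechanically before ingest. A minimal sketch in Python; the function names are ours, and a production check would also need to handle byte-order marks and UTF-16 prologs:

```python
import re

def declared_xml_encoding(raw: bytes) -> str:
    """Extract the encoding named in the XML declaration (defaults to UTF-8)."""
    match = re.match(rb'<\?xml[^>]*encoding=["\']([A-Za-z0-9._-]+)["\']', raw)
    return match.group(1).decode("ascii") if match else "utf-8"

def encoding_mismatch(raw: bytes) -> bool:
    """Return True if the document cannot be decoded with its declared encoding."""
    try:
        raw.decode(declared_xml_encoding(raw))
        return False
    except (UnicodeDecodeError, LookupError):
        return True

# A document that claims UTF-8 but contains an ISO-8859-1 byte (0xE9):
bad = b'<?xml version="1.0" encoding="UTF-8"?><a>caf\xe9</a>'
good = '<?xml version="1.0" encoding="UTF-8"?><a>caf\u00e9</a>'.encode("utf-8")
```

As the article notes, such errors surface quickly in practice because no conforming parser can read the mislabelled document; a check like this simply moves the failure to the point of transfer rather than the point of use.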
Type
Web Page
Title
The Making and the Keeping of Records: (2) The Tyranny of Listing
CA Listing is tantamount to traditional recordkeeping methodology. This paradigm needs to be reconsidered to allow for better-designed archival systems.
Conclusions
RQ Can we ultimately abandon the traditional concern of ensuring records' persistence and still keep records?
CA Overview of the program, including keynote speakers, papers presented, invited talks, future directions and next steps.
Conclusions
RQ Some steps to be taken: (1) Investigate potential move to a formal standards body/group and adopt their procedures and processes. Potential groups include: W3C, OASIS, ECMA, IEEE, IETF, CEN/ISS, Open Group. The advantages and disadvantages of such a move will be documented and discussed within the ODRL community. (2) Potential to submit current ODRL version to national bodies for adoption. (3) Request formal liaison relationship with the OMA. <warrant>
This report focuses on the development of tools for the description and intellectual control of archives and the discovery of relevant resources by users. Other archival functions, such as appraisal, acquisition, preservation, and physical control, are beyond the scope for this project. The system developed as a result of this report should be useable on stand-alone computers in small institutions, by multiple users in larger organisations, and by local, regional, national, and international networks. The development of such a system should take into account the strategies, experiences, and results of other initiatives such as the European Union Archival Network (EUAN), the Linking and Exploring Authority Files (LEAF) initiative, the European Visual Archives (EVA) project, and the Canadian Archival Information Network (CAIN). This report is divided into five sections. A description of the conceptual structure of an archival information system, described as six layers of services and protocols, follows this introduction. Section three details the functional requirements for the software tool and is followed by a discussion of the relationship of these requirements to existing archival software application. The report concludes with a series of recommendations that provide a strategy for the successful development, deployment, and maintenance of an Open Source Archival Resource Information System (OSARIS). There are two appendices: a data model and a comparison of the functional requirements statements to several existing archival systems.
Notes
3. Functional Requirements: Requirements for Information Interchange, 3.2: The system must support the current archival standards for machine-readable data communication, Encoded Archival Description (EAD) and Encoded Archival Context (EAC). A subset of elements found in EAD may be used to exchange descriptions based on ISAD(G), while elements in EAC may be used to exchange ISAAR(CPF)-based authority data.
Publisher
International Council on Archives Committee on Descriptive Standards
Critical Arguements
CA The Ad Hoc Committee agrees that it would be highly desirable to develop a modular, open source software tool that could be used by archives worldwide to manage the intellectual control of their holdings through the recording of standardized descriptive data. Individual archives could combine their data with that of other institutions in regional, national or international networks. Researchers could access this data either via a stand-alone computerized system or over the Internet. The model for this software would be the successful UNESCO-sponsored free library program, ISIS, which has been in widespread use around the developing world for many years. The software, with appropriate supporting documentation, would be freely available via an ICA or UNESCO web site or on CD-ROM. Unlike ISIS, however, the source code and not just the software should be freely available.
Conclusions
RQ "1. That the ICA endorses the functional requirements presented in this document as the basis for moving the initiative forward. 2. That the functional desiderata and technical specifications for the software applications, such as user requirements, business rules, and detailed data models, should be developed further by a team of experts from both ICA/CDS and ICA/ITC as the next stage of this project. 3. That following the finalization of the technical specifications for OSARIS, the requirements should be compared to existing systems and a decision made to adopt or adapt existing software or to build new applications. At that point in time, it will then be possible to estimate project costs. 4. That a solution that incorporates the functional requirements result in the development of several modular software applications. 5. That the implementation of the system should follow a modular strategy. 6. That the development of software applications must include a thorough investigation and assessment of existing solutions beginning with those identified in section four and Appendix B of this document. 7. That the ICA develop a strategy for communicating the progress of this project to members of the international archival community on a regular basis. This would include the distribution of progress reports in multiple languages. The communication strategy must include a two-way exchange of ideas. The project will benefit strongly from the ongoing comments, suggestions, and input of the members of the international archival community. 8. That a test-bed be developed to allow the testing of software solutions in a realistic archival environment. 9. That the system specifications, its documentation, and the source codes for the applications be freely available. 10. That training courses for new users, ongoing education, and web-based support groups be established. 11. That promotion of the software be carried out through the existing regional infrastructure of ICA and through UNESCO. 12. That an infrastructure for ongoing maintenance, distribution, and technical support be developed. This should include a web site to download software and supporting documentation. The ICA should also establish and maintain a mechanism for end-users to recommend changes and enhancements to the software. 13. That the ICA establishes and maintains an official mechanism for regular review of the software by an advisory committee that includes technical and archival experts."
SOW
DC "The development of such a system should take into account the strategies, experiences, and results of other initiatives such as the European Union Archival Network (EUAN), the Linking and Exploring Authority Files (LEAF) initiative, the European Visual Archives (EVA) project, and the Canadian Archival Information Network (CAIN)."
Just like other memory institutions, libraries will have to play an important part in the Semantic Web. In that context, ontologies and conceptual models in the field of cultural heritage information are crucial, and the interoperability between these ontologies and models perhaps even more crucial. This document reviews four projects and models that the FRBR Review Group recommends for consideration as to interoperability with FRBR.
Publisher
International Federation of Library Associations and Institutions
Critical Arguements
CA "Just like other memory institutions, libraries will have to play an important part in the Semantic Web. In that context, ontologies and conceptual models in the field of cultural heritage information are crucial, and the interoperability between these ontologies and models perhaps even more crucial."
Conclusions
RQ 
SOW
DC "Some members of the CRM-SIG, including Martin Doerr himself, also are subscribers to the FRBR listserv, and Patrick Le Boeuf, chair of the FRBR Review Group, also is a member of the CRM-SIG and ISO TC46/SC4/WG9 (the ISO Group on CRM). A FRBR to CRM mapping is available from the CIDOC CRM-SIG listserv archive." ... This report was produced by the Cataloguing Section of IFLA, the International Federation of Library Associations and Institutions. 
This document is a draft version 1.0 of requirements for a metadata framework to be used by the International Press Telecommunications Council for all new and revised IPTC standards. It was worked on and agreed to by members of the IPTC Standards Committee, who represented a variety of newspaper, wire agencies, and other interested members of the IPTC.
Notes
Misha Wolf is also listed as author.
Publisher
International Press Telecommunications Council (IPTC)
Critical Arguements
CA "This Requirements document forms part of the programme of work called IPTC Roadmap 2005. The Specification resulting from these Requirements will define the use of metadata by all new IPTC standards and by new major versions of existing IPTC standards." (p. 1) ... "The purpose of the News Metadata Framework (NMDF) WG is to specify how metadata will be expressed, referenced, and managed in all new major versions of IPTC standards. The NMDF WG will: Gather, discuss, agree and document functional requirements for the ways in which metadata will be expressed, referenced and managed in all new major versions of IPTC standards; Discuss, agree and document a model, satisfying these requirements; Discuss, agree and document possible approaches to expressing this model in XML, and select those most suited to the tasks. In doing so, the NMDF WG will, where possible, make use of the work of other standards bodies." (p. 2)
Conclusions
RQ "Open issues include: The versioning of schemes, including major and minor versions, and backward compatibility; the versioning of TopicItems; The design of URIs for TopicItem schemes and TopicItem collections, including the issues of: versions (relating to TopicItems, schemes, and collections); representations (relating to TopicItems and collections); The relationship between a [scheme, code] pair, the corresponding URI and the scheme URI." (p. 17)
SOW
DC The development of this framework came out of the 2003 News Standards Summit, which was attended by representatives from over 80 international press and information agencies ... "The News Standards Summit brings together major players--experts on news metadata standards as well as commercial news providers, users, and aggregators. Together, they will analyze the current state and future expectations for news and publishing XML and metadata efforts from both the content and processing model perspectives. The goal is to increase understanding and to drive practical, productive convergence." ... This is a draft version of the standard.
Type
Web Page
Title
Kansas Electronic Recordkeeping Strategy: A White Paper
The Library of Congress EAD Practices Working Group has drafted these proposed guidelines for the creation of EAD finding aids at the Library of Congress, a process which has included documenting current practices at the Library, examining other documented standards and practices, and addressing outstanding issues.  
Publisher
Library of Congress
Language
English
Critical Arguements
CA These guidelines are intended for use in conjunction with the EAD Tag Library Version 1.0 and EAD Application Guidelines, published by the Society of American Archivists and the Library of Congress and available online at http://www.loc.gov/ead/.
Conclusions
RQ
SOW
DC "The guidelines were made available to the Library of Congress EAD Technical Group for review, and many suggestions for improvement have been incorporated into this final draft which is now available for use by Library staff."
CA The metadata necessary for successful management and use of digital objects is both more extensive than and different from the metadata used for managing collections of printed works and other physical materials. Without structural metadata, the page image or text files comprising the digital work are of little use, and without technical metadata regarding the digitization process, scholars may be unsure of how accurate a reflection of the original the digital version provides. For internal management purposes, a library must have access to appropriate technical metadata in order to periodically refresh and migrate the data, ensuring the durability of valuable resources.
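The distinction drawn above between structural and technical metadata can be sketched in code. This is an illustrative model only; the field names are invented and do not correspond to any particular standard:

```python
# Hypothetical sketch: pairing structural metadata (page order) with
# technical metadata (digitization details) for one digital work.
# All field names are illustrative, not drawn from a specific schema.

digital_work = {
    "identifier": "example-work-001",
    "structural": [
        # Without this ordering, the page image files are just loose files.
        {"file": "page-001.tif", "sequence": 1, "label": "Title page"},
        {"file": "page-002.tif", "sequence": 2, "label": "Page 1"},
    ],
    "technical": {
        # Needed later to judge fidelity and to plan refreshing/migration.
        "scanner": "ExampleScan 9000",
        "resolution_dpi": 600,
        "bit_depth": 24,
        "capture_date": "2003-05-14",
    },
}

# Reassemble the reading order from the structural metadata alone.
reading_order = [p["file"] for p in sorted(digital_work["structural"],
                                           key=lambda p: p["sequence"])]
```

Dropping either part illustrates the argument: without `structural`, the files cannot be sequenced into a usable work; without `technical`, a future migration cannot assess what fidelity the capture had.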
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
CA The role of archives and archivists is being fundamentally redefined in consideration of postcustodial theories and practice.
Conclusions
RQ Who is accountable? How explicit should the "imprint" of the archivist be in the shaping of the record? Who decides (and how) what we remember and what we keep?
CA There is great potential in developing a national standard for the control of records that combines traditional recordkeeping practices with continuum-based thinking and cutting-edge metadata.
Conclusions
RQ One challenge is integrating item-level metadata with system-level metadata. Linking old and new archival descriptive systems should be done as seamlessly as possible, since retrofitting would be too expensive. Another important area is linking contextual metadata to records whenever they are used outside their domain in order to provide "external validation" (p.17) <warrant>
Type
Web Page
Title
Metadata Reference Guide: ONIX ONline Information eXchange
CA According to Editeur, the group responsible for the maintenance of the ONIX standard, ONIX is the international standard for representing book, serial, and video product information in electronic form.
Type
Web Page
Title
NHPRC: Minnesota State Archives Strategic Plan: Electronic Records Consultant Project
National Historical Publications and Records Commission Grant No. 95-030
Critical Arguements
CA "The Electronic Records Consultant Project grant was carried out in conjunction with the strategic planning effort for the Minnesota Historical Society's State Archives program. The objective was to develop a plan for a program that will be responsive to the changing nature of government records." ... "The strategic plan that was developed calls for specific actions to meet five goals: 1) strengthening partnerships, 2) facilitating the identification of historically valuable records, 3) integrating electronic records into the existing program, 4) providing quality public service, and 5) structuring the State Archives Department to meet the demands of this plan."
Type
Web Page
Title
Minnesota Recordkeeping Metadata Standard (IRM Standard 20, Version 1.2)
<P1> The Minnesota Recordkeeping Metadata Standard is referenced as a "current standard" in the Minnesota Enterprise Technical Architecture under Chapter 4, "Data and Records Management Architecture." State agencies bound by the Architecture should reference that document for compliance requirements. <P2> The Minnesota Recordkeeping Metadata Standard is directly based upon the one developed by the National Archives of Australia (NAA), the Recordkeeping Metadata Standard for Commonwealth Nations, version 1.0, May 1999. (p. 7) <warrant> <P3> The Minnesota Recordkeeping Metadata Standard (Minnesota Office of Technology standard IRM 20) was developed to facilitate records management by government entities at any level of government.
Type
Web Page
Title
Creating and Documenting Text: A Guide to Good Practice
CA "The aim of this Guide is to take users through the basic steps involved in creating and documenting an electronic text or similar digital resource. ... This Guide assumes that the creators of electronic texts have a number of common concerns. For example, that they wish their efforts to remain viable and usable in the long-term, and not to be unduly constrained by the limitations of current hardware and software. Similarly, that they wish others to be able to reuse their work, for the purposes of secondary analysis, extension, or adaptation. They also want the tools, techniques, and standards that they adopt to enable them to capture those aspects of any non-electronic sources which they consider to be significant -- whilst at the same time being practical and cost-effective to implement."
Conclusions
RQ "While a single metadata scheme, adopted and implemented wholescale would be the ideal, it is probable that a proliferation of metadata schemes will emerge and be used by different communities. This makes the current work centred on integrated services and interoperability all the more important. ... The Warwick Framework (http://www.ukoln.ac.uk/metadata/resources/wf.html) for example suggests the concept of a container architecture, which can support the coexistence of several independently developed and maintained metadata packages which may serve other functions (rights management, administrative metadata, etc.). Rather than attempt to provide a metadata scheme for all web resources, the Warwick Framework uses the Dublin Core as a starting point, but allows individual communities to extend this to fit their own subject-specific requirements. This movement towards a more decentralised, modular and community-based solution, where the 'communities of expertise' themselves create the metadata they need has much to offer. In the UK, various funded organisations such as the AHDS (http://ahds.ac.uk/), and projects like ROADS (http://www.ilrt.bris.ac.uk/roads/) and DESIRE (http://www.desire.org/) are all involved in assisting the development of subject-based information gateways that provide metadata-based services tailored to the needs of particular user communities."
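The container architecture quoted above can be sketched as a data structure: independently maintained metadata "packages" coexist in one container, with Dublin Core as the shared starting point. Package and field names here are invented for illustration, not taken from the Warwick Framework specification:

```python
# Illustrative model of the Warwick Framework container idea:
# several independently developed metadata packages live side by side.

container = {
    "dublin_core": {            # the common starting point
        "title": "Survey of Parish Records",
        "creator": "Example Archive",
    },
    "rights_management": {      # a separately maintained package
        "license": "restricted",
    },
    "subject_community": {      # a community-specific extension
        "period": "1550-1700",
    },
}

def find_packages(container, field):
    """Return the names of packages that carry a given field.
    Packages remain independent: none needs to know about the others."""
    return [name for name, pkg in container.items() if field in pkg]
```

A discovery service could consult only `dublin_core`, while a rights service reads `rights_management`, which is the decentralised, modular division of labour the passage describes.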
"The ERMS Metadata Standard forms Part 2 of the National Archives' 'Requirements for Electronic Records Management Systems' (commonly known as the '2002 Requirements'). It is specified in a technology independent manner, and is aligned with the e-Government Metadata Standard (e-GMS) version 2, April 2003. A version of e-GMS v2 including XML examples was published in the autumn of 2003. This Guide should be read in conjunction with the ERMS Metadata Standard. Readers may find the GovTalk Schema Guidelines (available via http://www.govtalk.gov.uk ) helpful regarding design rules used in building the schemas."
Conclusions
RQ Electronically enabled processes need to generate appropriate records, according to established records management principles. These records need to reach the ERMS that captures them with enough information to enable the ERMS to classify them appropriately, allocate an appropriate retention policy, etc.
SOW
DC This document is a draft.
Type
Web Page
Title
Recordkeeping Metadata Standard for Commonwealth Agencies
This standard describes the metadata that the National Archives of Australia recommends should be captured in the recordkeeping systems used by Commonwealth government agencies. ... Part One of the standard explains the purpose and importance of standardised recordkeeping metadata and details the scope, intended application and features of the standard. Features include: flexibility of application; repeatability of data elements; extensibility to allow for the management of agency-specific recordkeeping requirements; interoperability across systems environments; compatibility with related metadata standards, including the Australian Government Locator Service (AGLS) standard; and interdependency of metadata at the sub-element level.
Critical Arguements
CA Compliance with the Recordkeeping Metadata Standard for Commonwealth Agencies will help agencies to identify, authenticate, describe and manage their electronic records in a systematic and consistent way to meet business, accountability and archival requirements. In this respect the metadata is an electronic recordkeeping aid, similar to the descriptive information captured in file registers, file covers, movement cards, indexes and other registry tools used in the paper-based environment to apply intellectual and physical controls to records.
Conclusions
RQ "The National Archives intends to consult with agencies, vendors and other interested parties on the implementation and continuing evolution of the Recordkeeping Metadata Standard for Commonwealth Agencies." ... "The National Archives expects to re-examine and reissue the standard in response to broad agency feedback and relevant advances in theory and methodology." ... "The development of public key technology is one area the National Archives will monitor closely, in consultation with the Office for Government Online, for possible additions to a future version of the standard."
SOW
DC "This standard has been developed in consultation with recordkeeping software vendors endorsed by the Office for Government Online's Shared Systems Initiative, as well as selected Commonwealth agencies." ... "The standard has also been developed with reference to other metadata standards emerging in Australia and overseas to ensure compatibility, as far as practicable, between related resource management tools, including: the Dublin Core-derived Australian Government Locator Service (AGLS) metadata standard for discovery and retrieval of government services and information in web-based environments, co-ordinated by the National Archives of Australia; and the non-sector-specific Recordkeeping Metadata Standards for Managing and Accessing Information Resources in Networked Environments Over Time for Government, Social and Cultural Purposes, co-ordinated by Monash University using an Australian Research Council Strategic Partnership with Industry Research and Training (SPIRT) Support Grant."
This document is a revision and expansion of "Metadata Made Simpler: A guide for libraries," published by NISO Press in 2001.
Publisher
NISO Press
Critical Arguements
CA An overview of what metadata is and does, aimed at librarians and other information professionals. Describes various metadata schemas. Concludes with a bibliography and glossary.
Type
Web Page
Title
Use of Encoded Archival Description (EAD) for Manuscript Collection Finding Aids
Presented in 1999 to the Library's Collection Development & Management Committee, this report outlines support for implementing EAD in delivery of finding aids for library collections over the Web. It describes the limitations of HTML, provides an introduction to SGML, XML, and EAD, outlines the advantages of conversion from HTML to EAD, the conversion process, the proposed outcome, and sources for further information.
Publisher
National Library of Australia
Critical Arguements
CA As use of the World Wide Web has increased, so has the need of users to be able to discover web-based information resources easily and efficiently, and to be able to repeat that discovery in a consistent manner. Using SGML to mark up web-based documents facilitates such resource discovery.
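The advantage of SGML/XML over presentation-oriented HTML can be shown with a toy fragment. The element names below follow EAD conventions (`archdesc`, `did`, `unittitle`), but this is a minimal illustration, not a valid finding aid:

```python
import xml.etree.ElementTree as ET

# A toy EAD-style fragment: the markup names what each value *is*,
# so discovery is repeatable across documents and collections.
xml = """
<ead>
  <archdesc level="collection">
    <did>
      <unittitle>Example Family Papers</unittitle>
      <unitdate>1890-1920</unitdate>
    </did>
  </archdesc>
</ead>
"""

root = ET.fromstring(xml)
# The collection title is always at a known path, unlike in HTML,
# where it might sit in any heading or paragraph element.
title = root.findtext("./archdesc/did/unittitle")
date = root.findtext("./archdesc/did/unitdate")
```

Because every finding aid encoded this way exposes its title at the same path, a search service can retrieve it consistently, which is the repeatable resource discovery the passage argues for.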
Conclusions
RQ To what extent have the mainstream web browser companies fulfilled their commitment to support native viewing of SGML/XML documents?
This guide is optimized for creation of EAD-encoded finding aids for the collections of New York University and New York Historical Society. The links on the page list tools and files that may be downloaded and referenced for production of NYU-conformant finding aids.
Publisher
New York University
Critical Arguements
CA "This guide is optimized for creation of EAD-encoded finding aids for the collections of New York University and New York Historical Society. Instructions assume the use of NoteTab as the XML editor, utilizing template files that serve as base files for the different collections." 
Conclusions
RQ
SOW
DC This guide serves both New York University and the New York Historical Society.
Type
Web Page
Title
Preservation Metadata and the OAIS Information Model: A Metadata Framework to Support the Preservation of Digital Objects
CA "In March 2000, OCLC and RLG sponsored the creation of a working group to explore consensus-building in the area of preservation metadata. ... The charge of the group was to pool their expertise and experience to develop a preservation metadata framework applicable to a broad range of digital preservation activities." (p.1) "The OAIS information model offers a broad categorization of the types of information falling under the scope of preservation metadata; it falls short, however, of providing a decomposition of these information types into a list of metadata elements suitable for practical implementation. It is this need that the working group addressed in the course of its activities, the results of which are reported in this paper." (p. 47)
Conclusions
RQ "The metadata framework described in this paper can serve as a foundation for future work in the area of preservation metadata. Issues of particular importance include strategies and best practices for implementing preservation metadata in an archival system; assessing the degree of descriptive richness required by various types of digital preservation activities; developing algorithms for producing preservation metadata automatically; determining the scope for sharing preservation metadata in a cooperative environment; and moving beyond best practice towards an effort at formal standards building in this area." (47)
SOW
DC "[The OCLC and RLG working group] began its work by publishing a white paper entitled Preservation Metadata for Digital Objects: A Review of the State of the Art, which defined and discussed the concept of preservation metadata, reviewed current thinking and practice in the use of preservation metadata, and identified starting points for consensus-building activity in this area. The group then turned its attention to the main focus of its activity -- the collaborative development of a preservation metadata framework. This paper reports the results of the working group's efforts in that regard." (p. 1-2)
Joined-up government needs joined-up information systems. The e-Government Metadata Standard (e-GMS) lays down the elements, refinements and encoding schemes to be used by government officers when creating metadata for their information resources or designing search interfaces for information systems. The e-GMS is needed to ensure maximum consistency of metadata across public sector organisations.
Publisher
Office of the e-Envoy, Cabinet Office, UK.
Critical Arguements
CA "The e-GMS is concerned with the particular facets of metadata intended to support resource discovery and records management. The Standard covers the core set of 'elements' that contain data needed for the effective retrieval and management of official information. Each element contains information relating to a particular aspect of the information resource, e.g. 'title' or 'creator'. Further details on the terminology being used in this standard can be found in Dublin Core and Part Two of the e-GIF."
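The element-per-facet idea can be sketched briefly. The element names `title` and `creator` come from the passage itself; the rest of the record and the search helper are invented for illustration:

```python
# Minimal sketch of an e-GMS-style record: one element per facet of
# the resource. Only "title" and "creator" are named by the standard's
# own example; the other fields here are hypothetical.

record = {
    "title": "Guidance on Flood Defences",
    "creator": "Example Government Office",
    "subject": "environment",      # supports resource discovery
    "date.created": "2002-11-01",  # supports records management
}

def matches(record, element, term):
    """A search interface can rely on the same element names across
    all public sector organisations -- the consistency the standard
    exists to guarantee."""
    return term.lower() in record.get(element, "").lower()
```

Two departments using the same element set can be searched by one interface without per-department mapping, which is the "maximum consistency of metadata across public sector organisations" the standard aims at.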
Conclusions
RQ "The e-GMS will need to evolve, to ensure it remains comprehensive and consistent with changes in international standards, and to cater for changes in use and technology. Some of the elements listed here are already marked for further development, needing additional refinements or encoding schemes. To limit disruption and cost to users, all effort will be made to future-proof the e-GMS. In particular we will endeavour: not to remove any elements or refinements; not to rename any elements or refinements; not to add new elements that could contain values contained in the existing elements."
SOW
DC The E-GMS is promulgated by the British government as part of its e-government initiative. It is the technical cornerstone of the e-government policy for joining up the public sector electronically and providing modern, improved public services.
CA Government records and recordkeeping systems must be accountable and able to produce reliable and authentic information and records. A set of criteria was developed by the Ohio Electronic Records Committee to establish the trustworthiness of information systems.
During the past decade, the recordkeeping practices in public and private organizations have been revolutionized. New information technologies from mainframes to PCs to local area networks and the Internet have transformed the way state agencies create, use, disseminate, and store information. These new technologies offer a vastly enhanced means of collecting information for and about citizens, communicating within state government and between state agencies and the public, and documenting the business of government. Like other modern organizations, Ohio state agencies face challenges in managing and preserving their records because records are increasingly generated and stored in computer-based information systems. The Ohio Historical Society serves as the official State Archives with responsibility to assist state and local agencies in the preservation of records with enduring value. The Office of the State Records Administrator within the Department of Administrative Services (DAS) provides advice to state agencies on the proper management and disposition of government records. Out of concern over its ability to preserve electronic records with enduring value and assist agencies with electronic records issues, the State Archives has adapted these guidelines from guidelines created by the Kansas State Historical Society. The Kansas State Historical Society, through the Kansas State Historical Records Advisory Board, requested a program development grant from the National Historical Publications and Records Commission to develop policies and guidelines for electronic records management in the state of Kansas. With grant funds, the KSHS hired a consultant, Dr. Margaret Hedstrom, an Associate Professor in the School of Information, University of Michigan and formerly Chief of State Records Advisory Services at the New York State Archives and Records Administration, to draft guidelines that could be tested, revised, and then implemented in Kansas state government.
Notes
These guidelines are part of the ongoing effort to address the electronic records management needs of Ohio state government. As a result, this document continues to undergo changes. The first draft, written by Dr. Margaret Hedstrom, was completed in November of 1997 for the Kansas State Historical Society. That version was reorganized and updated and posted to the KSHS Web site on August 18, 1999. The Kansas Guidelines were modified for use in Ohio during September 2000.
Critical Arguements
CA "This publication is about maintaining accountability and preserving important historical records in the electronic age. It is designed to provide guidance to users and managers of computer systems in Ohio government about: the problems associated with managing electronic records, special recordkeeping and accountability concerns that arise in the context of electronic government; archival strategies for the identification, management and preservation of electronic records with enduring value; identification and appropriate disposition of electronic records with short-term value, and
Type
Web Page
Title
Online Archive of California Best Practice Guidelines for Encoded Archival Description, Version 1.1
These guidelines were prepared by the OAC Working Group's Metadata Standards Subcommittee during the spring and summer of 2003. This version of the OAC BPG EAD draws substantially on the
Language
Anonymous
Type
Web Page
Title
President of the Republic's Decree No. 137/2003 of 7 April 2003: "Regulation on Coordination Provisions in Matter of Electronic Signatures"
translated from Italian by Fiorella Foscarini of InterPARES
Conclusions
RQ The differentiation between internal and incoming/outgoing records, which is related to the complexities and costs of such a certification system, may impact the long-term preservation of heavy-signed and light-signed records and poses questions about different records' legal values and organizations' accountability. The paragraph about cut-back refers to the destruction of documents, not records. It is, however, significantly ambiguous.
SOW
DC Modifies the President of the Republic's Decree No. 445/2000 of 28 December 2000.
Type
Web Page
Title
Legislative Decree No. 10 of 23 January 2002: "Acknowledgement of the Directive No. 1999/93/CE on a Community Framework for Electronic Signatures"
translated from Italian by Fiorella Foscarini of InterPARES
Critical Arguements
CA Italian implementation of E.U. Directive No. 1999/93/CE on a Community Framework for Electronic Signatures. Article 6 (which replaces Article 10 of DPR 445/2000) defines the form and effectiveness of electronic records.
Type
Web Page
Title
President of the Republic's Decree No. 445/2000 of 28 December 2000: "Testo unico delle disposizioni legislative e regolamentari in materia di documentazione amministrativa" [Consolidated Text of the Legislative and Regulatory Provisions Concerning Administrative Documentation]
Requirements for Electronic Records Management Systems includes: (1) "Functional Requirements" (http://www.nationalarchives.gov.uk/electronicrecords/reqs2002/pdf/requirementsfinal.pdf); (2) "Metadata Standard" (the subject of this record); (3) Reference Document (http://www.nationalarchives.gov.uk/electronicrecords/reqs2002/pdf/referencefinal.pdf); and (4) "Implementation Guidance: Configuration and Metadata Issues" (http://www.nationalarchives.gov.uk/electronicrecords/reqs2002/pdf/implementation.pdf)
Publisher
Public Records Office, [British] National Archives
Critical Arguements
CA Sets out the implications for records management metadata in compliant systems. It has been agreed with the Office of the e-Envoy that this document will form the basis for an XML schema to support the exchange of records metadata and promote interoperability between ERMS and other systems.
SOW
DC The National Archives updated the functional requirements for electronic records management systems (ERMS) in collaboration with the central government records management community during 2002. The revision takes account of developments in cross-government and international standards since 1999.
Type
Web Page
Title
Record Keeping Metadata Requirements for the Government of Canada
This document comprises descriptions for metadata elements utilized by the Canadian Government as of January 2001.
Critical Arguements
CA "The Record Keeping Metadata is defined broadly to include the type of information Departments are required to capture to describe the identity, authenticity, content, context, structure and management requirements of records created in the context of a business activity. The Metadata model consists of elements, which are the attributes of a record that are comparable to fields in a database. The model is modular in nature. It permits Departments to use a core set of elements that will meet the minimum requirements for describing and sharing information, while facilitating interoperability between government Departments. It also allows Departments with specialized needs or the need for more detailed descriptions to add new elements and/or sub-elements to the basic metadata in order to satisfy their particular business requirements."
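The modular core-plus-extensions model described above can be sketched as follows. The core element names are illustrative, not the Government of Canada's actual element list:

```python
# Sketch of the modular idea: a core element set shared by all
# departments, plus department-specific elements layered on top.
# CORE_ELEMENTS is a hypothetical minimum, not the official set.

CORE_ELEMENTS = {"identifier", "title", "creator", "date", "disposition"}

def build_record(core_values, extensions=None):
    """Core elements meet the government-wide minimum for describing
    and sharing records; extensions carry a department's specialised
    needs without breaking interoperability."""
    missing = CORE_ELEMENTS - core_values.keys()
    if missing:
        raise ValueError(f"missing core elements: {sorted(missing)}")
    record = dict(core_values)          # interoperable baseline
    record.update(extensions or {})     # department-specific additions
    return record

rec = build_record(
    {"identifier": "GC-2001-001", "title": "Programme memo",
     "creator": "Example Department", "date": "2001-01-15",
     "disposition": "retain 5 years"},
    extensions={"case_number": "A-17"},  # hypothetical local element
)
```

Any two departments can exchange records on the strength of the core set alone, while each keeps its own extensions, which is the interoperability-with-extensibility trade-off the model is built around.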
Type
Web Page
Title
Capturing Electronic Transactional Evidence: The Future
To ensure that the digital collections submitted to RLG Cultural Materials can be discovered and understood, RLG has compiled these Descriptive Metadata Guidelines for contributors. While these guidelines reflect the needs of one particular service, they also represent a case study in information sharing across community and national boundaries. RLG Cultural Materials engages a wide range of contributors with different local practices and institutional priorities. Since it is impossible to find -- and impractical to impose -- one universally applicable standard as a submission format, RLG encourages contributors to follow the suite of standards applicable to their particular community (p.1).
Critical Arguements
CA "These guidelines . . . do not set a new standard for metadata submission, but rather support a baseline that can be met by any number of strategies, enabling participating institutions to leverage their local descriptions. These guidelines also highlight the types of metadata that enhance functionality for RLG Cultural Materials. After a contributor submits a collection, RLG maps that description into the RLG Cultural Materials database using the RLG Cultural Materials data model. This ensures that metadata from the various participant communities is integrated for efficient searching and retrieval" (p.1).
Conclusions
RQ Not applicable.
SOW
DC RLG comprises more than 150 research and cultural memory institutions, and RLG Cultural Materials elicits contributions from countless museums, archives, and libraries from around the world that, although they might retain local descriptive standards and metadata schemas, must conform to the baseline standards prescribed in this document in order to integrate into RLG Cultural Materials. Appendix A represents and evaluates the most common metadata standards with which RLG Cultural Materians is able to work.
Type
Web Page
Title
The MPEG-21 Rights Expression Language: A White Paper
CA Presents the business case for a Digital Rights Expression Language, an overview of the DRM landscape, a discussion of the history and role of standards in business, and some technical aspects of MPEG-21. "[U]nless the rights to ... content can be packaged within machine-readable licences, guaranteed to be ubiquitous, unambiguous and secure, which can then be processed consistently and reliably, it is unlikely that content owners will trust consign [sic] their content to networks. The MPEG Rights Expression Language (REL) is designed to provide the functionality required by content owners in order to create reliable, secure licences for content which can be used throughout the value chain, from content creator to content consumer."
Conclusions
RQ "While true interoperability may still be a distant prospect, a common rights expression language, with extensions based on the MPEG REL, can incrementally bring many of the benefits true interoperability will eventually yield. As extensions are created in multiple content verticals, it will be possible to transfer content generated in one securely to another. This will lead to cross channel fertilisation and the growth of multimedia content. At the same time, a common rights language will also lead to the possibility of broader content distribution (by enabling cross-DRM portability), thus providing more channel choice for consumers. It is this vision of the MPEG REL spreading out that is such an exciting prospect. ... The history of MPEG standards would seem to suggest that implementers will start building to the specification in mid-2003, coincidental with the completion of the standard. This will be followed by extensive take-up within two or three years, so that by mid 2006, the MPEG REL will be a pervasive technology, implemented across many different digital rights management and conditional access systems, in both the content industries and in other, non-rights based industries. ... The REL will ultimately become a 'transparent' technology, as invisible to the user as the phone infrastructure is today."
SOW
DC The Moving Picture Experts Group (MPEG) is a working group of ISO/IEC, made up of some 350 members from various industries and universities, in charge of the development of international standards for compression, decompression, processing, and coded representation of moving pictures, audio and their combination. MPEG's official designation is ISO/IEC JTC1/SC29/WG11. So far MPEG has produced the following compression formats and ancillary standards: MPEG-1, the standard for storage and retrieval of moving pictures and audio on storage media (approved Nov. 1992); MPEG-2, the standard for digital television (approved Nov. 1994); MPEG-4, the standard for multimedia applications; MPEG-7, the content representation standard for multimedia information search, filtering, management and processing; and MPEG-21, the multimedia framework.
Expanded version of the article "Ensuring the Longevity of Digital Documents" that appeared in the January 1995 edition of Scientific American (Vol. 272, Number 1, pp. 42-7).
Publisher
Council on Library and Information Resources
Critical Arguements
CA "It is widely accepted that information technology is revolutionizing our concepts of documents and records in an upheaval at least as great as the introduction of printing, if not of writing itself. The current generation of digital records therefore has unique historical significance; yet our digital documents are far more fragile than paper. In fact, the record of the entire present period of history is in jeopardy. The content and historical value of many governmental, organizational, legal, financial, and technical records, scientific databases, and personal documents may be irretrievably lost to future generations if we do not take steps to preserve them."
Conclusions
RQ "We must develop evolving standards for encoding explanatory annotations to bootstrap the interpretation of digital documents that are saved in nonstandard forms. We must develop techniques for saving the bit streams of software-dependent documents and their associated systems and application software. We must ensure that the hardware environments necessary to run this software are described in sufficient detail to allow their future emulation. We must save these specifications as digital documents, encoded using the bootstrap standards developed for saving annotations so that they can be read without special software (lest we be recursively forced to emulate one system in order to learn how to emulate another). We must associate contextual information with our digital documents to provide provenance as well as explanatory annotations in a form that can be translated into successive standards so as to remain easily readable. Finally, we must ensure the systematic and continual migration of digital documents onto new media, preserving document and program bit streams verbatim, while translating their contextual information as necessary."
Type
Web Page
Title
Interactive Fiction Metadata Element Set version 1.1, IFMES 1.1 Specification
This document defines a set of metadata elements for describing Interactive Fiction games. These elements incorporate and enhance most of the previous metadata formats currently in use for Interactive Fiction, and the set attempts to bridge them to modern standards such as the Dublin Core.
Critical Arguements
CA "There are already many metadata standards in use, both in the Interactive Fiction community and the internet at large. The standards used by the IF community cover a range of technologies, but none are fully compatible with bleeding-edge internet technology like the Semantic Web. Broader-based formats such as the Dublin Core are designed for the Semantic Web, but lack the specialized fields needed to describe Interactive Fiction. The Interactive Fiction Metadata Element Set was designed with three purposes. One, to fill in the specialized elements that Dublin Core lacks. Two, to unify the various metadata formats already in use in the IF community into a single standard. Three, to bridge these older standards to the Dublin Core element set by means of the RDF subclassing system. It is not IFMES's goal to provide every single metadata element needed. RDF, XML, and other namespace-aware languages can freely mix different vocabularies, therefore IFMES does not subclass Dublin Core elements that do not relate to previous Interactive Fiction metadata standards. For these elements, IFMES recommends using the existing Dublin Core vocabulary, to maximize interoperability with other tools and communities."
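The RDF subclassing bridge described above can be sketched with plain Python triples. This is a minimal stdlib-only illustration of how `rdfs:subPropertyOf` lets a Dublin-Core-aware tool discover a value asserted with a specialized element; the `if:author` property name and example URIs are hypothetical stand-ins, not the actual IFMES vocabulary.

```python
# Bridging a specialized metadata element to Dublin Core via
# rdfs:subPropertyOf, with triples modeled as (subject, predicate, object).
# Property names and URIs below are illustrative, not real IFMES terms.
DC = "http://purl.org/dc/elements/1.1/"
IF = "http://example.org/ifmes#"                 # hypothetical namespace
RDFS = "http://www.w3.org/2000/01/rdf-schema#"

# Schema: the specialized element is declared a subproperty of dc:creator.
schema = {(IF + "author", RDFS + "subPropertyOf", DC + "creator")}

# Data: a game record uses only the specialized element.
data = {("urn:example:game", IF + "author", "Author One")}

def infer(data, schema):
    """Apply the rdfs7 entailment rule: if p subPropertyOf q
    and (s, p, o) holds, then (s, q, o) also holds."""
    sub = {(p, q) for p, pred, q in schema if pred == RDFS + "subPropertyOf"}
    inferred = set(data)
    for s, p, o in data:
        for p2, q in sub:
            if p == p2:
                inferred.add((s, q, o))
    return inferred

closed = infer(data, schema)
# A Dublin-Core-only consumer querying dc:creator now finds the value:
print(("urn:example:game", DC + "creator", "Author One") in closed)
```

The design point is that the specialized vocabulary stays authoritative while generic Dublin Core tooling still interoperates, which is the "bridge" role the specification claims for RDF subclassing.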
Conclusions
RQ "Several of the IFMES elements can take multiple values. Finding a standard method of expressing multiple values is tricky. The approved method in RDF is either to repeat the predicate with different objects, or create a container as a child object. However, some RDF parsers don't work well with either of these methods, and many other languages don't allow them at all. XML has a value list format in which the values are separated with spaces, however this precludes spaces from appearing within the values themselves. A few legacy HTML attributes whose content models were never formally defined used commas to separate values that might contain spaces, and a few URI schemes accept multiple values separated by semicolons. The IFMES discussion group continues to examine this problem, and hopes to have a well-defined solution by the time this document reaches Candidate Recommendation status. For the time being IFMES recommends repeating the elements whenever possible, and using a container when that fails (for example, JSON could set the value to an Array). If an implementation simply must concatenate the values into a single string, the recommended separator is a space for URI and numeric types, and a comma followed by a space for text types."
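The two RDF-approved approaches the authors weigh, repeating the predicate versus attaching a container, can be contrasted in a small stdlib-only sketch; the `if:author` property, namespace, and author names are hypothetical placeholders, not IFMES terms.

```python
# Two ways to express multiple values for one property, with RDF-style
# triples modeled as (subject, predicate, object) tuples.
# Vocabulary and values below are illustrative placeholders.
IF = "http://example.org/ifmes#"
RDF = "http://www.w3.org/1999/02/22-rdf-syntax-ns#"
game = "http://example.org/games/example-game"

# Approach 1: repeat the predicate with a different object each time.
repeated = {
    (game, IF + "author", "Author One"),
    (game, IF + "author", "Author Two"),
}

# Approach 2: point the predicate at a container node (rdf:Bag) whose
# numbered membership properties (rdf:_1, rdf:_2, ...) hold the values.
bag = "_:authors"                        # blank node for the container
container = {
    (game, IF + "author", bag),
    (bag, RDF + "type", RDF + "Bag"),
    (bag, RDF + "_1", "Author One"),
    (bag, RDF + "_2", "Author Two"),
}

def values(triples, subject, predicate):
    """Collect every object asserted for (subject, predicate)."""
    return sorted(o for s, p, o in triples if s == subject and p == predicate)

print(values(repeated, game, IF + "author"))
```

Either form preserves both values without ambiguity, which is exactly what space- or comma-separated concatenation cannot guarantee once values may themselves contain the separator.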
SOW
DC The authors are writers and programmers in the interactive fiction community.
This standard sets out principles for making and keeping full and accurate records as required under section 12(1) of the State Records Act 1998. The principles are: Records must be made; Records must be accurate; Records must be authentic; Records must have integrity; Records must be useable. Each principle is supported by mandatory compliance requirements.
Critical Arguements
CA "Section 21(1) of the State Records Act 1998 requires public offices to 'make and keep full and accurate records'. The purpose of this standard is to assist public offices to meet this obligation and to provide a benchmark against which a public office's compliance may be measured."
Conclusions
RQ None
SOW
DC This standard is promulgated by the State Records Agency of New South Wales, Australia, as required under section 12(1) of the State Records Act 1998.
CA NSW has issued their metadata standard because one of the "key methods" for assuring the long-term preservation of e-records is the use of standardized sets of recordkeeping metadata. Not only can their metadata strategy help public offices meet their individual requirements for accu
Type
Web Page
Title
Archiving of Electronic Digital Data and Records in the Swiss Federal Archives (ARELDA): e-government project ARELDA - Management Summary
The goal of the ARELDA project is to find long-term solutions for the archiving of digital records in the Swiss Federal Archives. This includes the accession, the long-term storage, preservation of data, description, and access for the users of the Swiss Federal Archives. It is also coordinated with the basic efforts of the Federal Archives to realize a uniform records management solution in the federal administration and therefore to support the pre-archival creation of documents of archival value for the benefits of the administration as well as of the Federal Archives. The project is indispensable for the long-term execution of the Federal Archives Act: older IT systems are being replaced by newer ones, and a complete migration of the data is sometimes not possible or too expensive; small database applications, built and maintained by people with no IT background, are constantly increasing; and more and more administrative bodies are introducing records and document management systems.
Publisher
Swiss Federal Archives
Publication Location
Bern
Critical Arguements
CA "Archiving in general is a necessary prerequisite for the reconstruction of governmental activities as well as for the principle of legal certainty. It enables citizens to understand governmental activities and ensures a democratic control of the federal administration. And finally are archives a prerequisite for the scientific research, especially in the social and historical fields and ensure the preservation of our cultural heritage. It plays a vital role for an ongoing and efficient records management. A necessary prerequisite for the Federal Archives in the era of the information society will be the system ARELDA (Archiving of Electronic Data and Records)."
Conclusions
RQ "Because of the lack of standard solutions and limited or lacking personal resources for an internal development effort, the realisation of ARELDA will have to be outsourced and the cooperation with the IT division and the Federal Office for Information Technology, Systems and Telecommunication must be intensified. The guidelines for the projects are as follows:
SOW
DC ARELDA is one of the five key projects in the Swiss government's e-government strategy.
Type
Web Page
Title
Metadata Resources: Metadata Encoding and Transmission Standard (METS)
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
Museums and the Online Archive of California (MOAC) builds on existing standards and their implementation guidelines provided by the Online Archive of California (OAC) and its parent organization, the California Digital Library (CDL). Setting project standards for MOAC consisted of interpreting existing OAC/CDL documents and adapting them to the project's specific needs, while at the same time maintaining compliance with OAC/CDL guidelines. The present overview of the MOAC technical standards references both the OAC/CDL umbrella document and the MOAC implementation/adaptation document at the beginning of each section, as well as related resources which provide more detail on project specifications.
Critical Arguements
CA The project implements specifications for digital image production, as well as three interlocking file exchange formats for delivering collections, digital images and their respective metadata. Encoded Archival Description (EAD) XML describes the hierarchy of a collection down to the item level and traditionally serves for discovering both the collection and the individual items within it. For viewing multiple images associated with a single object record, MOAC utilizes Making of America 2 (MOA2) XML. MOA2 makes the images representing an item available to the viewer through a navigable table of contents; the display mimics the behavior of the analog item by, for example, allowing end-users to browse through the pages of an artist's book. Through the further extension of MOA2 with Text Encoding Initiative (TEI) Lite XML, not only does every single page of the book display in its correct order, but a transcription of its textual content also accompanies the digital images.
Conclusions
RQ "These two instances of fairly significant changes in the project's specifications may serve as a gentle reminder that despite its solid foundation in standards, the MOAC information architecture will continue to face the challenge of an ever-changing technical environment."
SOW
DC The author is Digital Media Developer at the UC Berkeley Art Museum & Pacific Film Archives, a member of the MOAC consortium.
Type
Web Page
Title
Imaging Nuggets: Metadata Encoding and Transmission Standard
CA The main advantages of METS are the following: first, it provides a syntax for transferring entire digital objects along with their associated metadata and other supporting files. Second, it provides a functional syntax, a basis for providing users the means of navigating through and manipulating the object. Third, it provides a syntax for archiving the data as an integrated whole.
CA One problem in the field of radio archives is the tendency to view anything that is not audio or video (which leaves, specifically, text) as metadata. However, not all text is metadata. While all text can be seen as potentially useful due to the information it represents, the creators of P/FRA recommend standardizing only the essential information needed to describe and retrieve radio archive information.
Conclusions
RQ Rules need to be drafted specifying the content of metadata fields. While the authors extol the value of "good metadata" for resource discovery, prescribing the content of metadata containers is a problem here as in every other field.