The author reviews the research questions still unanswered by work on the capture of metadata required for recordness. These include how to maintain inviolable linkages between records and their metadata in a variety of architectures, what structure metadata content should take, how the semantics of records metadata relates to that of other electronic sources, how records can acquire new metadata over time, how to maintain the meaning of contextual metadata over time, the use of metadata in records management, and the design of environments in which Business Acceptable Communications (BAC), those with appropriate evidential metadata, can persist.
Critical Arguements
CA "My research consists of model building which enables the construction of theories and parallel implementations based on shared assumptions. Some of these models are now being tested in applications, so this report reflects both what we do not yet know from abstract constructs and questions being generated by field testing. " ... Bearman overviews research questions such as semantics, syntax, structure and persistence of metadata that still need to be addressed.
Phrases
<P1> Records are evidence when they are bound to appropriate metadata about their content, structure and context. <P2> The metadata required for evidence is described in the Reference Model for Business Acceptable Communications (BAC). <P3> Metadata which is required for evidence must continue to be associated with the record to which it relates over time and neither it nor the record content can be alterable. <P4> To date we have only identified three implementations which, logically, could allow metadata to retain this inviolable connection. Metadata can be: kept in a common envelope WITH a record (encapsulated), bound TO a record (by integrity controls within an environment), or LINKED with a record through a technical and/or social process (registration, key deposit, etc.). <P5> Metadata content was defined in order to satisfy a range of functional requirements of records, hence it ought to have a structure which enables it to serve these functions effectively and in concrete network implementations. <warrant> <P6> Clusters of metadata must operate together. Clusters of metadata are required by different processes which take place at different times, for different software clients, and within a variety of processes. Distinct functions will need access to specified metadata substructures and must be able to act on these appropriately. Structures have been proposed in the Reference Model for Business Acceptable Communications. <P7> Metadata required for recordness must, logically, be standard; that required for administration of recordkeeping systems is extensible and locally variable. <P8> Records metadata must be semantically homogeneous but it is probably desirable for it to be syntactically heterogeneous and for a range of protocols to operate against it. Records metadata management system requirements have both an internal and external aspect; internally they satisfy management requirements while externally they satisfy on-going recordness requirements. <P9> The metadata has to come either from a specific user/session or from rules defined to extract data either from a layer in the application or a layer between the application and the recording event. <P10> A representation of the business context must exist from which the record-creating event can obtain metadata values. <P11> Structural metadata must both define the dependent structures and identify them to a records management environment which is "patrolling" for dependencies which are becoming risky in the evolving environment in order to identify needs for migration. <P12> BAC-conformant environments could reduce overheads if standards supported the uniform management of records from the point of issue to the point of receipt. Could redundancy now imposed by both paper and electronic processes be dramatically reduced if records referenced other records?
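The binding options in P4 can be made concrete with a short sketch. The following illustrates the first two (encapsulation plus an integrity seal); the envelope format, field names, and choice of SHA-256 are assumptions for illustration, not part of the BAC Reference Model:

```python
import hashlib
import json

def encapsulate(record_content: bytes, metadata: dict) -> dict:
    """Wrap record content and its evidential metadata in one envelope,
    sealed with a digest so later alteration of either is detectable.
    Illustrative only; BAC itself does not prescribe this format."""
    envelope = {
        "metadata": metadata,
        "content": record_content.hex(),
    }
    canonical = json.dumps(envelope, sort_keys=True).encode("utf-8")
    envelope["seal"] = hashlib.sha256(canonical).hexdigest()
    return envelope

def verify(envelope: dict) -> bool:
    """Recompute the seal to confirm the record-metadata binding is intact."""
    unsealed = {k: v for k, v in envelope.items() if k != "seal"}
    canonical = json.dumps(unsealed, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest() == envelope["seal"]

env = encapsulate(b"Approved.", {"agent": "J. Smith", "action": "approval"})
assert verify(env)
```

Any later change to either the content or the metadata changes the digest, so verify() fails; the third option in P4 (registration, key deposit) would place the seal, or the key behind it, in the custody of an external party.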
Conclusions
RQ "All the proposed methods have some degree of external dependency. What are the implications software dependencies? Encapsulation, integrity controls and technico-social process are all software dependent. Is this avoidable? Can abstract reference models of the metadata captured by these methods serve to make them effectively software independent? " ... "What are the relative overhead costs of maintaining the systems which give adequate societal assurances of records retention following any of these approaches? Are there some strategies that are currently more efficient or effective? What are the organizational requirements for implementing metadata capture systems? In particular, what would the costs of building such systems within a single institution be versus the costs of implementing records metadata adhering communications servers on a universal scale?" ... "Can we model mechanisms to enable an integrated environment of recordkeeping throughout society for all electronically communicated transactions?" ... "Are the BAC structures workable? Complete? Extensible in ways that are known to be required? For example, metadata required for ÔÇ£recordnessÔÇØ is created at the time of the creation of the records but other metadata, as premised by the Warwick Framework, 2 may be created subsequently. Are these packets of metadata orthogonal with respect to recordness? If not, how are conflicts dealt with? " ... "Not all metadata references fixed facts. Thus, for example, we have premised that proper reference to a retention schedule is a citation to an external source rather than a date given within the metadata values of a record. Similar external references are required for administration of shifting access permissions. What role can registries (especially rights clearinghouses) play in a world of electronic records? How well do existing languages for permission management map to the requirements of records administration, privacy and confidentiality protection, security management, records retention and destruction, etc." ... "Not all records will be created with equally perfect metadata. Indeed risk-based decisions taken by organizations in structuring their recordsÔÇÖ capture are likely to result in conscious decisions to exclude certain evidential metadata. What are the implications of incomplete metadata on an individual organization level and on a societal level? Does the absence of data as a result of policy need to be noted? And if so, how?" ... "Since metadata has owners, howdo owners administer recordsÔÇÖ metadata over time? In particular, since records contain records, how are the layers of metadata exposed for management and administrative needs (if internal metadata documenting dependencies can slip through the migration process, we will end up with records that cannot serve as evidence. If protected records within unprotected records are not protected, we will end up with insecure records environments, etc. etc.)." ... "In principle, the BAC could be expressed as Dublin metadata 3 and insofar as it cannot be, the Dublin metadata will be inadequate for evidence. What other syntax could be used? How could these be comparatively tested?" .. "Could Dublin Core metadata, if extended by qualifying schema, serve the requirements of recordness? Records are, after all, documents in the Dublin sense of fixed information objects. What would the knowledge representation look like?" ... 
"Strategies for metadata capture currently locate the source of metadata either in the API layer, or the communications system, using data provided by the application (an analysis supports defining which data and where they can be obtained), from the user interface layer, or from the business rules defined for specified types of communication pathways. Can all the required metadata be obtained by some combination of these sources? In other words, can all the metadata be acquired from sources other than content created by the record-creator for the explicit and sole purpose of documentation (since such data is both suspect in itself and the demand for it is annoying to the end user)? " ... "Does the capture of metadata from the surrounding software layers require the implementation of a business-application specific engine, or can we design generic tools that provide the means by which even legacy computing systems can create evidential records if the communication process captures the interchange arising from a record-event and binds it with appropriate metadata?" ... "What kinds of representations of business processes and structures can best carry contextualizing metadata at this level of granularity and simultaneously serve end user requirements? Are the discovery and documentation representations of provenance going to have to be different? " ... "Can a generic level of representation of context be shared? Do standards such a STEP 4 provide adequate semantic rules to enable some meaningful exchange of business context information? " ... "Using past experiences of expired standards as an indicator, can the defined structural metadata support necessary migrations? Are the formal standards of the source and target environments adequate for actual record migration to occur?" ... "What metadata is required to document a migration itself?" ... "Reduction of redundancy requires record uses to impose post-creation metadata locks on records created with different retention and access controls. To what extent is the Warwick Framework relevant to these packets and can architectures be created to manage these without their costs exceeding the savings?" ... "A number of issues about proper implementation depend on the evolution (currently very rapid) of metadata strategies in the broader Internet community. Issues such as unique identification of records, external references for metadata values, models for metadata syntax, etc. cannot be resolved for records without reference to the ways in which the wider community is addressing them. Studies that are supported for metadata capture methods need to be aware of, and flexible in reference to, such developments."
The Getty Art History Information Program: Research Agenda for Cultural Heritage on Information Networks
Publication Year
1995
Critical Arguements
CA The inability to effectively preserve and authenticate electronic records presents a significant problem for humanities research, which depends on correct attribution and the ability to view resources long after they were created.
Phrases
<P1> Current research on software dependence and interoperability is not largely driven by archival concerns and takes a relatively short view on the requirement to preserve functionality. Little research has been done on modeling the information loss that accompanies multiple migrations or the risks inherent in the use of commercial systems before standards are developed, yet these are the critical questions being posed by archives. (p.2) <P2> The metadata required for recordness and the means to capture this data and ensure that it is bonded to electronic communications is the most significant area for research in the near future. (p.3) <P3> Within organizations, archivists must find automatic means of identifying the business process for which a record is generated. Such data modeling will become increasingly critical in an era of ongoing business re-engineering. If records are retained for their evidential significance and for a period associated with risk, then certain knowledge of their functional source is essential to their rational control. If they are retained for long-term informational value, knowledge of context is necessary to understand their significance. (p.3) <warrant>
Conclusions
RQ We need to research what value e-records have other than as a means of assessing accountability. How are they used, and what value do users derive from them? What do we need to know about a record's content to support the discovery of billions of records? How can our preservation solutions be made scaleable?
Type
Journal
Title
Archival Issues in Network Electronic Publications
"Archives are retained information systems that are developed according to professional principles to meet anticipated demands of user clienteles in the context of the changing conditions created by legal environments and electronic or digital technologies. This article addresses issues in electronic publishing, including authentication, mutability, reformatting, preservation, and standards from an archival perspective. To ensure continuing access to electronically published texts, a special emphasis is placed on policy planning in the development and implementation of electronic systems" (p.701).
Critical Arguements
<P1> Archives are established, administered, and evaluated by institutions, organizations, and individuals to ensure the retention, preservation, and utilization of archival holdings (p.701) <P2> The three principal categories of archival materials are official files of institutions and organizations, publications issued by such bodies, and personal papers of individuals. . . . Electronic information technologies have had profound effects on aspects of all these categories (p.702) <P3> The primary archival concern with regard to electronic publishing is that the published material should be transferred to archival custody. When the transfer occurs, the archivist must address the issues of authentication, appraisal, arrangement, description, and preservation or physical protection (p.702) <P4> The most effective way to satisfy archival requirements for handling electronic information is the establishment of procedures and standards to ensure that valuable material is promptly transferred to archival custody in a format which will permit access on equipment that will be readily available in the future (p.702) <P5> Long-term costs and access requirements are the crucial factors in determining how much information should be retained in electronic formats (p.703) <P6> Authentication involves a determination of the validity or integrity of information. Integrity requires the unbroken custody of a body of information by a responsible authority or individual <warrant> (p.703) <P7> From an archival perspective, the value of information is dependent on its content and the custodial responsibility of the agency that maintains it -- e.g., the source determines authenticity. The authentication of archival information requires that it be verified as to source, date, and content <warrant> (p.704) <P8> Information that is mutable, modifiable, or changeable loses its validity if the persons adding, altering, or deleting information cannot be identified and the time, place and nature of the changes is unknown (p.704) <P9> [P]reservation is more a matter of access to information than it is a question of survival of any physical information storage media (p.704) <P10> [T]o approach the preservation of electronic texts by focusing on physical threats will miss the far more pressing matter of ensuring continued accessibility to the information on such storage media (p.706) <P11> If the information is to remain accessible as long as paper, preservation must be a front-end, rather than an ex post facto, action (p.708) <P12> [T]he preservation of electronic texts is first and foremost a matter of editorial and administrative policy rather than of techniques and materials (p.708) <P13> Ultimately, the preservation of electronic publications cannot be solely an archival issue but an administrative one that can be addressed only if the creators and publishers take an active role in providing resources necessary to ensure that ongoing accessibility is part of initial system and product design (p.709) <P14> An encouraging development is that SGML has been considered to be a critical element for electronic publishing because of its transportability and because it supports multiple representations of a single text . . . (p.711) <P15> Underlying all questions of access is the fundamental consideration of cost (p.711)
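P6 and P7 tie authenticity to unbroken custody and to verification of source, date, and content. A minimal sketch of that binding using an HMAC follows; a real archival system would more likely use public-key signatures and documented custody chains, and the key and field names below are assumptions for illustration:

```python
import hashlib
import hmac
import json

CUSTODIAN_KEY = b"..."  # hypothetical key held by the responsible authority

def attest(source: str, date: str, content: bytes) -> str:
    """Bind source, date, and content into a single verifiable attestation."""
    message = json.dumps({"source": source, "date": date,
                          "content": content.hex()}, sort_keys=True)
    return hmac.new(CUSTODIAN_KEY, message.encode(), hashlib.sha256).hexdigest()

def authenticate(source: str, date: str, content: bytes, attestation: str) -> bool:
    """Verification fails if any of the three elements has been altered."""
    return hmac.compare_digest(attest(source, date, content), attestation)
```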
Type
Journal
Title
The Management of Digital Data: A metadata approach
CA "Super-metadata may well play a crucial role both in facilitating access to DDOs and in providing a means of selecting and managing the maintenance of these DDOs over time."
Phrases
<P1> The preservation of the intellectual content of DDOs brings into focus a major issue: "the integrity and authenticity of the information as originally recorded" (Graham, 1997). (p.365). <P2> The emergence of dynamic and living DDOs is presenting challenges to the conventional understanding of the preservation of digital resources and is forcing many organizations to reevaluate their strategies in the light of these rapid advances in information sources. The use of appropriate metadata is recognized to be essential in ensuring continued access to dynamic and living DDOs, but the standards for such metadata are not yet fully understood or developed. (p.369)
Conclusions
RQ How can we decide what to preserve? How can we assure long-term access? What will be the cost of electronic archiving? Which metadata schema will be in use 10 years from now, and how will migration be achieved?
Type
Journal
Title
Warrant and the Definition of Electronic Records: Questions Arising from the Pittsburgh Project
The University of Pittsburgh Electronic Recordkeeping Research Project established a model for developing functional requirements and metadata specifications based on warrant, defined as the laws, regulations, best practices, and customs that regulate recordkeeping. Research has shown that warrant can also increase the acceptance by records creators and others of functional requirements for recordkeeping. This article identifies areas related to warrant that require future study. The authors conclude by suggesting that requirements for recordkeeping may vary from country to country and industry to industry because of differing warrant.
Publisher
Kluwer Academic Publishers
Publication Location
The Netherlands
Critical Arguements
CA Poses a long series of questions and issues concerning warrant and its ability to increase the acceptance of recordkeeping requirements. Proposes that research be done to answer these questions. Discusses two different views about whether warrant can be universal and/or international.
Phrases
<P1> As we proceeded with the project [the University of Pittsburgh Electronic Recordkeeping Research Project] we ultimately turned our attention to the idea of the literary warrant -- defined as the mandate from law, professional best practices, and other social sources requiring the creation and continued maintenance of records. Wendy Duff's doctoral research found that warrant can increase the acceptance of some recordkeeping functional requirements, and therefore it has the potential to build bridges between archival professionals and others concerned with or responsible for recordkeeping. We did not anticipate the value of the literary warrant and, in the hindsight now available to us, the concept of the warrant may turn out to be the most important outcome of the project. <P2> In Wendy Duff's dissertation, legal, auditing and information science experts evaluated the authority of the sources of warrant for recordkeeping. This part of the study provided evidence that information technology standards may lack authority, but this finding requires further study. Moreover, the number of individuals who evaluated the sources of warrant was extremely small. A much larger number of standards should be included in a subsequent study and a greater number of subjects are needed to evaluate these standards. <P3> We found a strong relationship between warrant and the functional requirements for electronic recordkeeping systems. Research that studies this relationship and determines the different facets that may affect it might provide more insights into the relationship between the warrant and the functional requirements. <P4> [W]e need to develop a better understanding of the degree to which the warrant for recordkeeping operates in various industries, disciplines, and other venues. Some institutions operate in a much more regulated environment than others, suggesting that the importance of records and the understanding of records may vary considerably between institutional types, across disciplines and from country to country. <P5> We need to consider whether the recordkeeping functional requirements for evidence hold up or need to be revised for recordkeeping requirements for corporate memory, accountability, and cultural value -- the three broad realms now being used to discuss records and recordkeeping. <P6> The warrant gathered to date has primarily focused on technical, legal or the administrative value of records. A study that tested the effectiveness of warrant that supported the cultural or historical mandate of archives might help archivists gain support for their archival programs. <P7> This concern leads us to a need for more research about the understanding of records and recordkeeping in particular institutions, disciplines, and societies. <P8> A broader, and perhaps equally important question, is whether individual professionals and workers are even aware of their regulatory environment. <P9> How do the notion of the warrant and the recordkeeping functional requirements relate to the ways in which organizations work and the management tools they use, such as business process reengineering and data warehousing? <P10> What are the economic implications for organizations to comply with the functional requirements for recordkeeping in evidence? <P11> Is there a warrant and separate recordkeeping functional requirements for individual or personal recordkeeping?
<P12> As more individuals, especially writers, financial leaders, and corporate and societal innovators, adopt electronic information technologies for the creation of their records, an understanding of the degree of warrant for such activity and our ability to use this warrant to manage these recordkeeping systems must be developed. <P13> We believe that archivists and records managers can improve their image if they become experts in all aspects of recordkeeping. This will require a thorough knowledge of the legal, auditing, information technology, and management warrant for recordkeeping. <P14> The medical profession emphasizes that [sic] need to practice evidence-based medicine. We need to find out what would happen if records managers followed suit, and emphasized and practiced warrant-based recordkeeping. Would this require a major change in what we do, or would it simply be a new way to describe what we have always done? <P15> More work also has to be done on the implications of warrant and the functional requirements for the development of viable archives and records management programs. <P16> The warrant concept, along with the recordkeeping functional requirements, seem to possess immense pedagogical implications for what future archivists or practicing archivists, seeking to update their skills, should or would be taught. <P17> We need to determine the effectiveness of using the warrant and recordkeeping functional requirements as a basis for graduate archival and records management education and for developing needed topics for research by masters and doctoral students. <P18> The next generation of educational programs might be those located in other professional schools, focusing on the particular requirements for records in such institutions as corporations, hospitals, and the courts. <P19> We also need to determine the effectiveness of using the warrant and recordkeeping functional requirements in continuing education, public outreach, and advocacy for helping policy makers, resource allocators, administrators, and others to understand the importance of archives and records. Can the warrant and recordkeeping functional requirements support or foster stronger partnerships with other professions, citizen action groups, and other bodies interested in accountability in public organizations and government? <P20> Focusing on the mandate to keep and manage records, instead of the records as artifacts or interesting stuff, seems much more relevant in late twentieth century society. <P21> We need to investigate the degree to which records managers and archivists can develop a universal method for recordkeeping. ... Our laws, regulations, and best practices are usually different from country to country. Therefore, must any initiative to develop warrant also be bounded by our borders? <P22> A fundamental difference between the Pittsburgh Project and the UBC project is that UBC wishes to develop a method for managing and preserving electronic records that is applicable across all juridical systems and cultures, while the Pittsburgh Project is proposing a model that enables recordkeeping to be both universal and local at the same time. <P23> We now have a records management standard from Australia which is relevant for most North American records programs. It has been proposed as an international standard, although it is facing opposition from some European countries.
Can there be an international standard for recordkeeping and can we develop one set of procedures which will be accepted across nations? Or must methods of recordkeeping be adapted to suit specific cultures, juridical systems, or industries?
Over the last decade a number of writers have encouraged archivists to develop strategies and tactics to redefine their role and to insert themselves into the process of designing recordkeeping systems. This paper urges archivists to exploit the authority inherent in the laws, regulations, standards, and professional best practices that dictate recordkeeping specifications to gain greater acceptance for the requirements for electronic evidence. Furthermore, it postulates that this proactive approach could assist in gaining greater respect for the archival profession.
Critical Arguements
CA The use of authoritative sources of warrant would improve acceptance of electronic records as evidence and create greater respect for the archival profession.
Phrases
<P1> The legal, administrative, fiscal, or information value of records is dependent upon the degree of trust society places in records as reliable testimony or evidence of the acts they purport to document. In turn, this trust is dependent on society's faith in the procedures that control the creation and maintenance of the record. <P2> [S]ociety bestows some methods of recordkeeping and record creating with an authority or 'warrant' for generating reliable records. <P3> David Bearman first proposed the idea of "literary warrant." <P4> [S]tatements of warrant provide clear instructions on how records should be kept and delineate elements needed for the records to be complete. <P5> The information technology field promulgates standards, but in North America adherence to them is voluntary rather than obligatory. <P6> The University of Pittsburgh Electronic Recordkeeping Project suggested that requirements for electronic recordkeeping should derive from authoritative sources, such as the law, customs, standards, and professional best practices accepted by society and codified in the literature of different professions concerned with records and recordkeeping rather than developed in isolation. <P7> On their own, archival requirements for recordkeeping have very little authority as no authoritative agencies such as standards boards or professional associations have yet to endorse them [sic] and few archivists have the authority to insist that their organizations follow them. <P8> An NHPRC study suggested that archivists have not been involved in the process of meeting the challenges of electronic records because they are undervalued by their colleagues, or, in other words, are not viewed as a credible source.
Conclusions
RQ "By highlighting the similarity between recordkeeping requirements and the requirements delineated in authoritative statements in the law, auditing standards, and professional best practices, archivists will increase the power of their message. ... If archivists are to take their rightful place as regulators of an organization's documentary requirements, they will have to reach beyond their own professional literature and understand the requirements for recordkeeping imposed by other professions and society in general. Furthermore, they will have to study methods of increasing the accpetance of their message and the impact and power of warrant."
Type
Journal
Title
Will Metadata Replace Archival Description: A Commentary
CA Before archival description can be replaced by metadata, "archivists must first study their user needs, identify processes that protect the integrity and impartiality of records, and ensure the capture of important contextual information." (p.38)
Phrases
<P1> Unfortunately, information systems often do not create records, concentrating instead on the preservation of information to the detriment of recordkeeping. Concern over this issue has led Wallace to promote a new role for archivists, one that places them at the conception of the life cycle, establishing standards for record preservation and management as well as dictating record creation. Demarcation between archivists and records managers disappears in this new paradigm, and a new role as auditor, system designer, and regulator begins to emerge. (p.34) <P2> "Metadata are essential if archivists are to maintain the integrity and authenticity of evidence of actions. McNeil likens metadata systems to protocol registers and sees metadata itself as evidence, as well as a means of preserving evidence." (p.35)
Conclusions
RQ Will metadata replace archival description? Will metadata requirements fulfill the needs of secondary users? Will metadata require secondary descriptions?
Type
Journal
Title
Law, evidence and electronic records: A strategic perspective from the global periphery
CA A recordkeeping paradigm set up around the records continuum will take us into the future, because it sees opportunities, not problems, in e-environments. It fosters accountability through evidence-generating recordkeeping practices.
Phrases
<P1> This challenge is being addressed by what Chris Hurley has called second-generation archival law, which stretches the reach of archival jurisdictions into the domain of the record-creator. A good example of such archival law is South Africa's National Archives Act of 1996, which gives the National Archives regulatory authority over all public records from the moment of their creation. The Act provides a separate definition of "electronic records systems" and accords the National Archives specific powers in relation to their management. Also significant is that the Act brings within the National Archives' jurisdiction those categories of record-creators commonly allowed exclusion -- the security establishment, public services outside formal structures of government, and "privatized" public service agencies. (p.34) <P2> A characteristic (if an absence can be a characteristic) of most archival laws, first and second generation, is a failure to define either the conditions/processes requiring "recording" or the generic attributes of a "record." (p.34) <P3> Archival law, narrowly defined, is not at the cutting edge and is an increasingly small component of broader recordkeeping regimes. This is one of the many signs of an emerging post-custodial era, which Chris Hurley speculates will be informed by a third generation of archival law. Here, the boundaries between recordkeeping domains dissolve, with all of them being controlled by universal rules. (p.34)
Conclusions
RQ What is the relationship between the event and the record? Is the idea of evidence pivotal to the concept of "recordness"? Should evidence be privileged above all else in defining a record? What about remembering, forgetting, imagining?
Type
Journal
Title
Documenting digital images: Textual meta-data at the Blake Archive
The Electronic Library: The International Journal for Minicomputer, Microcomputer, and Software Applications in Libraries
Publication Year
1998
Volume
16
Issue
4
Pages
239
Critical Arguements
CA One of the critical issues in the future development of digital libraries is the provision of documentary metadata for non-textual electronic files.
Phrases
<P1> "When libraries create digital image collections, however, documentation becomes more problematic. Even if an image is surrounded by a robust network of supporting materials, the functionality of client-server networks such as the World Wide Web permits the image to become detached from the documentary process." (p. 239)
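One partial remedy to the detachment problem is to carry textual documentation inside the image file itself. A small sketch using Pillow's PNG text chunks shows the idea; the library calls are real, but the filenames and field values are hypothetical:

```python
from PIL import Image
from PIL.PngImagePlugin import PngInfo

# Write documentary metadata into the PNG so it travels with the file.
info = PngInfo()
info.add_text("Title", "The Marriage of Heaven and Hell, copy D, plate 3")
info.add_text("Source", "Blake Archive (illustrative values only)")

img = Image.open("plate3.png")            # hypothetical source file
img.save("plate3_documented.png", pnginfo=info)

# Read it back: the documentation survives detachment from the web page.
print(Image.open("plate3_documented.png").text)
```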
Type
Journal
Title
Challenges for service providers when importing metadata in digital libraries
CA Problems in implementing metadata for online resource discovery, in this case for digital libraries, will not be solved simply by adopting a common schema. Intellectual property rights remain another major obstacle to be dealt with.
Conclusions
RQ Under what circumstances can metadata be altered? How should the copyright information of a resource be distinguished from the copyright information of its metadata? Will an audit trail be used as metadata shared with other repositories?
This article provides an overview of evolving Australian records continuum theory and the records continuum model, which is interpreted as both a metaphor and a new worldview, representing a paradigm shift in Kuhn's sense. It is based on a distillation of research findings drawn from discourse, literary warrant and historical analysis, as well as case studies, participant observation and reflection. The article traces the emergence in Australia in the 1990s of a community of practice which has taken continuum rather than life cycle based perspectives, and adopted postcustodial approaches to recordkeeping and archiving. It "places" the evolution of records continuum theory and practice in Australia in the context of a larger international discourse that was reconceptualizing traditional theory, and "reinventing" records and archives practice.
Publisher
Kluwer Academic Publishers
Publication Location
The Netherlands
Critical Arguements
CA Looks at the development of the Australian community of practice that led to records continuum theory: an approach that, in contrast to the North American life cycle approach, sees recordkeeping and archival practices as part of the same continuum of activities. Since the 1990s, there has been a lively debate between proponents of these two different ways of thinking. The second part of the article is highly theoretical, situating records continuum theory in the larger intellectual trend toward postmodernism and postpositivism.
Phrases
<P1> The model was built on a unifying concept of records inclusive of archives, which are defined as records of continuing value. It also drew on ideas about the "fixed" and "mutable" nature of records, the notion that records are "always in a process of becoming." (p. 334). <P2> Continuum ideas about the nature of records and archives challenge traditional understandings which differentiate "archives" from "records" on the basis of selection for permanent preservation in archival custody, and which focus on their fixed nature. Adopting a pluralist view of recorded information, continuum thinking characterises records as a special genre of documents in terms of their intent and functionality. It emphasises their evidentiary, transactional and contextual nature, rejecting approaches to the definition of records which focus on their subject content and informational value. (p. 335) <P3> [R]ecordkeeping and archiving processes ... help to assure the accessibility of meaningful records for as long as they are of value to people, organisations, and societies – whether that be for a nanosecond or millennia. (p. 336) <P4> [I]f North American understandings of the term record keeping, based on life cycle concepts of records management, are used to interpret the writings of members of the Australian recordkeeping community, there is considerable potential for misunderstanding. <P5> Members of the recordkeeping and archiving community have worked together, often in partnership with members of other records and archives communities, on a range of national policy and standards initiatives, particularly in response to the challenge of electronic recordkeeping. These collaborative efforts resulted in AS 4390, the Australian Standard: Records Management (1996), the Australian Council of Archives' Common Framework for Electronic Recordkeeping (1996), and the Australian Records and Archives Competency Standards (1997). In a parallel and interconnected development, individual archival organisations have been developing electronic recordkeeping policies, standards, system design methodologies, and implementation strategies for their jurisdictions, including the National Archives of Australia's suite of standards, policies, and guidelines under the e-permanence initiative launched in early 2000. These developments have been deliberately set within the broader context of national standards and policy development frameworks. Two of the lead institutions in these initiatives are the National Archives of Australia and the State Records Authority of New South Wales, which have based their work in this area on exploration of fundamental questions about the nature of records and archives, and the role of recordkeeping and archiving in society. <warrant> (p. 339) <P6> In adopting a continuum-based worldview and defining its "place" in the world, the Australian recordkeeping and archiving community consciously rejected the life cycle worldview that had dominated records management and archives practice in the latter half of the 20th century in North America. ... They were also strong advocates of the nexus between accountable recordkeeping and accountability in a democratic society, and supporters of the dual role of an archival authority as both a regulator of current recordkeeping, and preserver of the collective memory of the state/nation. (p.
343-344) <P7> [P]ost-modern ideas about records view them as dynamic objects that are fixed in terms of content and meaningful elements of their structure, but linked to ever-broadening layers of contextual metadata that manages their meanings, and enables their accessibility and useability as they move through "spacetime." (p. 349) <P8> In exploring the role of recordkeeping and archiving professionals within a postcustodial frame of reference, archival theorists such as Brothman, Brown, Cook, Harris, Hedstrom, Hurley, Nesmith, and Upward have concluded that they are an integral part of the record and archive making and keeping process, involved in society's remembering and forgetting. (p. 355) <P9> Writings on the societal context of functional appraisal have gone some way to translate into appraisal policies and strategies the implications of the shifts in perception away from seeing records managers as passive keepers of documentary detritus ... and archivists as Jenkinson's neutral, impartial custodians of inherited records. (p. 355-356)
Conclusions
RQ "By attempting to define, to categorise, pin down, and represent records and their contexts of creation, management, and use, descriptive standards and metadata schema can only ever represent a partial view of the dynamic, complex, and multi-dimensional nature of records, and their rich webs of contextual and documentary relationships. Within these limitations, what recordkeeping metadata research is reaching towards are ways to represent records and their contexts as richly and extensively as possible, to develop frameworks that recognise their mutable and contingent nature, as well as the role of recordkeeping and archiving professionals (records managers and archivists) in their creation and evolution, and to attempt to address issues relating to time and space." (p. 354)
Type
Journal
Title
Describing Records in Context in the Continuum: The Australian Recordkeeping Metadata Schema
CA RKMS is based on traditional recordkeeping thinking. However, it also looks to the future by viewing records as active agents of change, as intelligent information objects, which are supported by the metadata that RKMS' framework provides. Through RKMS, the dynamic world of business can be linked to the more passive world of cyberspace resource management.
Phrases
<P1> As long as records remain in the local domains in which they are created, a lot of broader contextual metadata is "in the air," carried in the minds of the corporate users of the records. When records move beyond the boundaries of the local domain in which they are created or, as is increasingly the case in networked environments, they are created in the first place in a global rather than a local domain, then this kind of metadata needs to be made explicit -- that is, captured and persistently linked to the record. This is essential so that users in the broader domain can uniquely identify, retrieve and understand the meanings of records. (p.7) <P2> The broader social context of the project is the need for individuals, society, government, and commerce to continually access the information they need to conduct their business, protect their rights and entitlements, and securely trace the trail of responsibility and action in distributed enterprises. ... Maintaining reliable, authentic and useable evidence of transactions through time and space has significant business, social, and cultural implications, as records provide essential evidence for purposes of governance, accountability, memory and identity. (p.6)
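A sketch of the point in P1: context that is "in the air" in the local domain must be captured as explicit, persistently linked metadata before the record travels. The class, identifier, and element names below are illustrative assumptions, not the RKMS element set:

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """Illustrative only: the point is that agent, business, and mandate
    context must be made explicit and persistently linked once a record
    moves beyond the domain in which it was created."""
    identifier: str
    content: str
    context: dict = field(default_factory=dict)

r = Record(
    identifier="urn:example:agency-7/1999/0042",  # hypothetical identifier
    content="Approval of dam construction permit",
    context={
        "agent":    {"name": "J. Citizen", "role": "Delegated approver"},
        "business": {"function": "Licensing", "activity": "Permit approval"},
        "mandate":  {"source": "Water Act 1989 s.12 (hypothetical citation)"},
    },
)
```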
Conclusions
RQ There is a need to develop typologies of recordkeeping relationships such as agent to record and better ways to express them through metadata.
Type
Journal
Title
Structuring the Records Continuum Part Two: Structuration Theory and Recordkeeping
In the previous issue of Archives and Manuscripts I presented the first part of this two part exploration. It dealt with some possible meanings for 'post' in the term postcustodial. For archivists, considerations of custody are becoming more complex because of changing social, technical and legal considerations. These changes include those occurring in relation to access and the need to document electronic business communications reliably. Our actions, as archivists, in turn become more complex as we attempt to establish continuity of custody in electronic recordkeeping environments. In this part, I continue the case for emphasising the processes of archiving in both our theory and practice. The archives as a functional structure has dominated twentieth century archival discourse and institutional ordering, but we are going through a period of transformation. The structuration theory of Anthony Giddens is used to show that there are very different ways of theorising about our professional activities than have so far been attempted within the archival profession. Giddens' theory, at the very least, provides a useful device for gaining insights into the nature of theory and its relationship with practice. The most effective use of theory is as a way of seeing issues. When seen through the prism of structuration theory, the forming processes of the virtual archives are made apparent.
Critical Arguements
CA "This part of my exploration of the continuum will continue the case for understanding 'postcustodial' as a bookmark term for a major transition in archival practice. That transition involves leaving a long tradition in which continuity was a matter of sequential control. Electronic recordkeeping processes need to incorporate continuity into the essence of recordkeeping systems and into the lifespan of documents within those systems. In addressing this issue I will present a structurationist reading of the model set out in Part 1, using the sophisticated theory contained in the work of Anthony Giddens. Structuration theory deals with process, and illustrates why we must constantly re-assess and adjust the patterns for ordering our activities. It gives some leads on how to go about re-institutionalising these new patterns. When used in conjunction with continuum thinking, Giddens' meta-theory and its many pieces can help us to understand the complexities of the virtual archives, and to work our way towards the establishment of suitable routines for the control of document management, records capture, corporate memory, and collective memory."
Phrases
<P1> Broadly the debate has started to form itself as one between those who represent the structures and functions of an archival institution in an idealised form, and those who increasingly concentrate on the actions and processes which give rise to the record and its carriage through time and space. In one case the record needs to be stored, recalled and disseminated within our institutional frameworks; in the other case it is the processes for storing, recalling, and disseminating the record which need to be placed into a suitable framework. <P2> Structure, for Giddens, is not something separate from human action. It exists as memory, including the memory contained within the way we represent, recall, and disseminate resources including recorded information. <P3> Currently in electronic systems there is an absence of recordkeeping structures and disconnected dimensions. The action part of the duality has raced ahead of the structural one; the structuration process has only just begun. <P4> The continuum model's breadth and richness as a conceptual tool is expanded when it is seen that it can encompass action-structure issues in at least three specialisations within recordkeeping: contemporary recordkeeping - current recordkeeping actions and the structures in which they take place; regulatory recordkeeping - the processes of regulation and the enabling and controlling structures for action such as policies, standards, codes, legislation, and promulgation of best practices; historical recordkeeping - explorations of provenance in which action and structure are examined forensically as part of the data sought about records for their storage, recall and dissemination. <P5> The capacity to imbibe information about recordkeeping practices in agencies will be crucial to the effectiveness of the way archival 'organisations' set up their postcustodial programs. They will have to monitor the distribution and exercise of custodial responsibilities for electronic records from before the time of their creation. <warrant> <P6> As John McDonald has pointed out, recordkeeping activities need to occur at desktop level within systems that are not dependent upon the person at the desktop understanding all of the details of the operation of that system. <P7> Giddens' more recent work on reflexivity has many parallels with metadata approaches to recordkeeping. What if the records, as David Bearman predicts, can be self-managing? Will they be able to monitor themselves? <P8> He rejects the life cycle model in sociology, based on ritualised passages through life, and writes of 'open experience thresholds'. Once societies, for example, had rites for coming of age. Coming of age in a high modern society is now a complex process involving a host of experiences and risks which are very different to that of any previous generation. Open experience thresholds replace the life cycle thresholds, and as the term infers, are much less controlled or predictable. <P9> There is a clear parallel with recordkeeping in a high modern environment. The custodial thresholds can no longer be understood in terms of the spatial limits between a creating agency and an archives. The externalities of the archives as place will decline in significance as a means of directly asserting the authenticity and reliability of records. The complexities of modern recordkeeping involve many more contextual relationships and an ever increasing network of relationships between records and the actions that take place in relation to them.
We have no need for a life cycle concept based on the premise of generational repetition of stages through which a record can be expected to pass. We have entered an age of more recordkeeping choices and of open experience thresholds. <P10> It is the increase in transactionality, and the technologies being used for those transactions, which are different. The solution, easier to write about than implement, is for records to parallel Giddens' high modern individual and make reflexive use of the broader social environment in which they exist. They can reflexively monitor their own action and, with encoding help from archivists and records managers, resolve their own crises as they arise. <warrant> <P11> David Bearman's argument that records can be self-managing goes well beyond the easy stage. It is supported by the Pittsburgh project's preliminary set of metadata specifications. The seeds of self-management can be found in object oriented programming, java, applets, and the growing understanding of the importance and nature of metadata. <P12> Continuum models further assist us to conceive of how records, as metadata encapsulated objects, can resolve many of their own life crises as they thread their way through time and across space. <P13> To be effective monitors of action, archival institutions will need to be recognised by others as the institutions most capable of providing guidance and control in relation to the integration of the archiving processes involved in document management, records capture, the organisation of corporate memory and the networking of archival systems. <warrant> <P14> Signification, in the theoretical domain, refers to our interpretative schemes and the way we encode and communicate our activities. At a macro level this includes language itself; at a micro level it can include our schemes for classification and ordering. <P15> The Pittsburgh project addressed the three major strands of Giddens' theoretical domain. It explored and set out functional requirements for evidence - signification. It sought literary warrants for archival tasks - legitimation. It reviewed the acceptability of the requirements for evidence within organisational cultures - domination. <P16> In Giddens' dimensional approach, the theoretical domain is re-defined to be about coding, organising our resources, and developing norms and standards. In this area the thinking has already begun to produce results, which leads this article in to a discussion of structural properties. <P17> Archivists deal with structural properties when, for example, they analyse the characteristics of recorded information such as the document, the record, the archive and the archives. The archives as a fortress is an observable structural property, as is the archives as a physical accumulation of records. Within Giddens' structuration theory, when archivists write about their favourite features, be they records or the archives as a place, they are discussing structural properties. <P18> Postcustodial practice in Australia is already beginning to put together a substantial array of structural properties. These developments are canvassed in the article by O'Shea and Roberts in the previous issue of Archives and Manuscripts. They include policies and strategies, standards, recordkeeping regimes, and what has come to be termed distributed custody. <P19> As [Terry] Eastwood comments in the same article, we do not have adequate electronic recordkeeping systems. 
Without them there can be no record in time-space to serve any form of accountability. <warrant> <P20> In the Pittsburgh project, for example, the transformation of recordkeeping processes is directed towards the creation and management of evidence, and possible elements of a valid rule-resource set have emerged. Elements can include the control of recordkeeping actions, accountability, the management of risk, the development of recordkeeping regimes, the establishment of recordkeeping requirements, and the specification of metadata. <P21> In a postcustodial approach it is the role of archival institutions to foster better recordkeeping practices within all the dimensions of recordkeeping. <warrant>
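P11 and P12 above picture records as metadata-encapsulated objects that monitor their own state. A minimal sketch of such a "self-managing record" follows; all field names, rules, and values are invented for illustration, not drawn from the Pittsburgh specifications:

```python
import datetime

class SelfManagingRecord:
    """Sketch of a record that carries enough metadata to reflexively
    monitor its own action and raise its own life crises."""
    def __init__(self, content, created, retention_years, rendering_format):
        self.content = content
        self.created = created
        self.retention_years = retention_years
        self.rendering_format = rendering_format  # a structural dependency

    def monitor(self, today, at_risk_formats):
        """Report crises instead of waiting for a custodian to notice them."""
        crises = []
        if today.year - self.created.year >= self.retention_years:
            crises.append("retention period expired: disposition decision needed")
        if self.rendering_format in at_risk_formats:
            crises.append(f"dependency on {self.rendering_format} is risky: migrate")
        return crises

rec = SelfManagingRecord("...", datetime.date(1997, 3, 1), 7, "WordPerfect 5.1")
print(rec.monitor(datetime.date.today(), {"WordPerfect 5.1"}))
```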
Conclusions
RQ "Best practice in the defence of the authoritative qualities of records can no longer be viewed as a linear chain, and the challenge is to establish new ways of legitimating responsibilities for records storage and custody which recognise the shifts which have occurred." ... "The recordkeeping profession should seek to establish itself as ground cover, working across terrains rather than existing tree-like in one spot. Beneath the ground cover there are shafts of specialisation running both laterally and vertically. Perhaps we can, as archivists, rediscover something that a sociologist like Giddens has never forgotten. Societies, including their composite parts, are the ultimate containers of recorded information. As a place in society, as Terry Cook argues, the archives is a multiple reality. We can set in train policies and strategies that can help generate multiplicity without losing respect for particular mine shafts. Archivists have an opportunity to pursue policies which encourage the responsible exercising of a custodial role throughout society, including the professions involved in current, regulatory and historical recordkeeping. If we take up that opportunity, our many goals can be better met and our concerns will be addressed more effectively."
SOW
DC "Frank Upward is a senior lecturer in the Department of Librarianship, Archives and Records at Monash University. He is an historian of the ideas contained in the Australian records continuum approach, and an ex- practitioner within that approach." ... "These two articles, and an earlier one on Ian Maclean and the origins of Australian continuum thinking, have not, so far, contained appropriate acknowledgements. David Bearman provided the necessary detonation of certain archival practices, and much more. Richard Brown and Terry Cook drew my attention to Anthony Giddens' work and their own work has helped shape my views. I have many colleagues at Monash who encourage my eccentricities. Sue McKemmish has helped shape my ideas and my final drafts and Barbara Reed has commented wisely on my outrageous earlier drafts. Livia Iacovino has made me stop and think more about the juridical tradition in recordkeeping. Chris Hurley produced many perspectives on the continuum during the 1996 seminars which have helped me see the model more fully. Don Schauder raised a number of key questions about Giddens as a theorist. Bruce Wearne of the Sociology Department at Monash helped me lift the clarity of my sociological explanations and made me realise how obsessed Giddens is with gerunds. The structural-functionalism of Luciana Duranti and Terry Eastwood provided me with a counterpoint to many of my arguments, but I also owe them debts for their respective explorations of recordkeeping processes and the intellectual milieu of archival ideas, and for their work on the administrative-juridical tradition of recordkeeping. Glenda Acland has provided perceptive comments on my articles - and supportive ones, for which I am most grateful given how different the articles are from conventional archival theorising. Australian Archives, and its many past and present staff members, has been important to me."
Type
Journal
Title
Structuring the Records Continuum Part One: Post-custodial principles and properties
The records continuum is becoming a much used term, but has seldom been defined in ways which show it is a time/space model not a life of the records model. Dictionary definitions of a continuum describe such features as its continuity, the indiscernibility of its parts, and the way its elements pass into each other. Precise definitions, accordingly, have to discern the indiscernible, identify points that are not distinct, and do so in ways which accommodate the continuity of change. This article, and a second part to be published in the next volume, will explore the continuum in time/space terms supported by a theoretical mix of archival science, postmodernity and the 'structuration theory' of Anthony Giddens. In this part the main objectives are to give greater conceptual firmness to the continuum; to clear the way for broader considerations of the nature of the continuum by freeing archivists from the need to debate custody; to show how the structural principles for archival practice are capable of different expression without losing contact with something deeper that can outlive the manner of expression.
Critical Arguements
CA "This is the first instalment of a two part article exploring the records continuum. Together the articles will build into a theory about the constitution of the virtual archives. In this part I will examine what it can mean to be 'postcustodial', outline some possible structural principles for the virtual archives, and present a logical model for the records continuum." ... "In what follows in the remainder of this article (and all of the next) , I will explore the relevance of [Anthony] Giddens' theory to the structuring of the records continuum."
Phrases
<P1> If the archival profession is to avoid a fracture along the lines of paper and electronic media, it has to be able to develop ways of expressing its ideas in models of relevance to all ages of recordkeeping, but do so in ways which are contemporaneous with our own society. <warrant> <P2> We need more of the type of construct provided by the Pittsburgh Project's functional requirements for evidence which are 'high modern' but can apply to recordkeeping over time. <P3> What is essential is for electronic records to be identified, controlled and accessible for as long as they have value to Government and the Community. <warrant> <P4> We have to face up to the complexification of ownership, possession, guardianship and control within our legal system. Even possession can be broken down into physical possession and constructed possession. We also have to face the potential within our technology for ownership, possession, custody or control to be exercised jointly by the archives, the organisation creating the records, and auditing agencies. The complexity requires a new look at our way of allocating authorities and responsibilities. <P5> In what has come to be known as the continuum approach Maclean argued that archivists should base their profession upon studies of the characteristics of recorded information, recordkeeping systems, and classification (the way the records were ordered within recordkeeping systems and the way these were ordered through time). <P6> A significant role for today's archival institution is to help to identify and establish functional requirements for recordkeeping that enable a more systematic approach to authentication than that provided by physical custody. <warrant> <P7> In an electronic work environment it means, in part, that the objectivity, understandability, availability, and usability of records need to be inherent in the way that the record is captured. In turn the documents need to be captured in the context of the actions of which they are part, and are recursively involved. <warrant> <P8> A dimensional analysis can be constructed from the model and explained in a number of ways including a recordkeeping system reading. When the co-ordinates of the continuum model are connected, the different dimensions of a recordkeeping system are revealed. The dimensions are not boundaries, the co-ordinates are not invariably present, and things may happen simultaneously across dimensions, but no matter how a recordkeeping system is set up it can be analysed in terms such as: first dimensional analysis: a pre-communication system for document creation within electronic systems [creating the trace]; second dimensional analysis: a post-communication system, for example traditional registry functionality which includes registration, the value adding of data for linking documents and disseminating them, and the maintenance of the record including disposition data [capturing trace as record]; third dimensional analysis: a system involving building, recalling and disseminating corporate memory [organising the record as memory]; fourth dimensional analysis: a system for building, recalling and disseminating collective memory (social, cultural or historical) including information of the type required for an archival information system [pluralizing the memory]. <P9> In the high modern recordkeeping environment of the 1990s a continuum has to take into account a different array of recordkeeping tools.
These tools, plucking a few out at random but ordering the list dimensionally, include: document management software, Australian records system software, the intranet and the internet. <P10> In terms of a records continuum which supports an evidence based recordkeeping approach, the second dimension is crucial. This is where the document is disembedded from the immediate contexts of the first dimension. It is this disembedding process that gives the record its value as a 'symbolic token'. A document is embedded in an act, but the document as a record needs to be validatable using external reference points. These points include the operation of the recordkeeping system into which it was received, and information pertaining to the technical, social (including business) and communication processes of which the document was part.
Conclusions
RQ "Postcustodial approaches to archives and records cannot be understood if they are treated as a dualism. They are not the opposite of custody. They are a response to opportunities for asserting the role of an archives - and not just its authentication role - in many re-invigorating ways, a theme which I will explore further in the next edition of Archives and Manuscripts."
SOW
DC "Frank Upward is a senior lecturer in the Department of Librarianship, Archives and Records at Monash University. He is an historian of the ideas contained in the Australian records continuum approach, and an ex-practitioner within that approach."
CA Strength of a metadata structure lies in its ability to layer and exchange information from a wide variety of creators in a "loosely coupled system of organization." This review covers metadata literature from mid 1996 through early 1998. It focuses on the development and application of metadata standards used by the LIS community for resource description, discovery and retrieval within a digital environment.
Annual Review of Information Science and Technology
Periodical Abbreviation
ARIST
Publication Year
2001
Volume
35
Pages
337
Critical Arguements
CA Yakel gives an overview of the literature on digital preservation from the early 1980s through 2000.
Phrases
<P1> The immediate question is whether the industry will adopt these standards, strategies, and functional elements to create evidence-based recordkeeping systems that ensure their authenticity and reliability. If the model of the DOD guidelines is any indication, some sectors of the vendor population will respond to recordkeeping specifications if there is sufficient customer leverage. Recordkeeping requirements rely on various metadata schemes and the viability of standards. (p.366) <warrant>
Conclusions
RQ More research into the hybrid approach (emulation and migration) is needed to determine criteria that will ensure the preservation of authenticity and reliability.
Type
Electronic Journal
Title
Keeping Memory Alive: Practices for Preserving Digital Content at the National Digital Library Program of the Library of Congress
CA An overview of the major issues and initiatives in digital preservation at the Library of Congress. "In the medium term, the National Digital Library Program is focusing on two operational approaches. First, steps are taken during conversion that are likely to make migration or emulation less costly when they are needed. Second, the bit streams generated by the conversion process are kept alive through replication and routine refreshing supported by integrity checks. The practices described here provide examples of how those steps are implemented to keep the content of American Memory alive."
Phrases
<P1> The practices described here should not be seen as policies of the Library of Congress; nor are they suggested as best practices in any absolute sense. NDLP regards them as appropriate practices based on real experience, the nature and content of the originals, the primary purposes of the digitization, the state of technology, the availability of resources, the scale of the American Memory digital collection, and the goals of the program. They cover not just the storage of content and associated metadata, but also aspects of initial capture and quality review that support the long-term retention of content digitized from analog sources. <P2> The Library recognizes that digital information resources, whether born digital or converted from analog forms, should be acquired, used, and served alongside traditional resources in the same format or subject area. Such responsibility will include ensuring that effective access is maintained to the digital content through American Memory and via the Library's main catalog and, in coordination with the units responsible for the technical infrastructure, planning migration to new technology when needed. <P3> Refreshing can be carried out in a largely automated fashion on an ongoing basis. Migration, however, will require substantial resources, in a combination of processing time, out-sourced contracts, and staff time. Choice of appropriate formats for digital masters will defer the need for large-scale migration. Integrity checks and appropriate capture of metadata during the initial capture and production process will reduce the resource requirements for future migration steps. <warrant> We can be certain that migration of content to new data formats will be necessary at some point. The future will see industrywide adoption of new data formats with functional advantages over current standards. However, it will be difficult to predict exactly which metadata will be useful to support migration, when migration of master formats will be needed, and the nature and extent of resource needs. Human experts will need to decide when to undertake migration and develop tools for each migration step. <P4> Effective preservation of resources in digital form requires (a) attention early in the life-cycle, at the moment of creation, publication, or acquisition and (b) ongoing management (with attendant costs) to ensure continuing usability. <P5> The National Digital Library Program has identified several categories of metadata needed to support access and management for digital content. Descriptive metadata supports discovery through search and browse functions. Structural metadata supports presentation of complex objects by representing relationships between components, such as sequences of images. In addition, administrative metadata is needed to support management tasks, such as access control, archiving, and migration. Individual metadata elements may support more than one function, but the categorization of elements by function has proved useful. <P6> It has been recognized that metadata representations appropriate for manipulation and long-term retention may not always be appropriate for real-time delivery. <P7> It has also been realized that some basic descriptive metadata (at the very least a title or brief description) should be associated with the structural and administrative metadata. 
<P8> During 1999, an internal working group reviewed past experience and prototype exercises and compiled a core set of metadata elements that will serve the different functions identified. This set will be tested and refined as part of pilot activities during 2000. <P9> Master formats are well documented and widely deployed, preferably formal standards and preferably non-proprietary. Such choices should minimize the need for future migration or ensure that appropriate and affordable tools for migration will be developed by the industry. <warrant>
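To make P3 concrete: the refresh-with-integrity-check practice can be reduced to a short routine. The sketch below is illustrative only - the function names and the choice of SHA-256 are assumptions, not documented NDLP practice - but it shows the basic discipline of verifying a recorded digest before copying a bit stream to fresh storage, then verifying the copy.

```python
import hashlib
import shutil
from pathlib import Path

def checksum(path: Path) -> str:
    """Compute a SHA-256 digest of a file's bit stream."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def refresh(master: Path, replica: Path, recorded_digest: str) -> None:
    """Copy the master to fresh storage only after verifying that its
    bits still match the digest recorded at capture time, then verify
    the new copy as well."""
    if checksum(master) != recorded_digest:
        raise ValueError(f"integrity check failed for {master}")
    shutil.copy2(master, replica)  # refresh onto new media
    if checksum(replica) != recorded_digest:
        raise ValueError(f"refresh produced a corrupt copy at {replica}")
```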
Conclusions
RQ "Developing long-term strategies for preserving digital resources presents challenges associated with the uncertainties of technological change. There is currently little experience on which to base predictions of how often migration to new formats will be necessary or desirable or whether emulation will prove cost-effective for certain categories of resources. ... Technological advances, while sure to present new challenges, will also provide new solutions for preserving digital content."
Type
Electronic Journal
Title
A Spectrum of Interoperability: The Site for Science Prototype for the NSDL
"Currently, NSF is funding 64 projects, each making its own contribution to the library, with a total annual budget of about $24 million. Many projects are building collections; others are developing services; a few are carrying out targeted research.The NSDL is a broad program to build a digital library for education in science, mathematics, engineering and technology. It is funded by the National Science Foundation (NSF) Division of Undergraduate Education. . . . The Core Integration task is to ensure that the NSDL is a single coherent library, not simply a set of unrelated activities. In summer 2000, the NSF funded six Core Integration demonstration projects, each lasting a year. One of these grants was to Cornell University and our demonstration is known as Site for Science. It is at http://www.siteforscience.org/ [Site for Science]. In late 2001, the NSF consolidated the Core Integration funding into a single grant for the production release of the NSDL. This grant was made to a collaboration of the University Corporation for Atmospheric Research (UCAR), Columbia University and Cornell University. The technical approach being followed is based heavily on our experience with Site for Science. Therefore this article is both a description of the strategy for interoperability that was developed for Site for Science and an introduction to the architecture being used by the NSDL production team."
ISBN
1082-9873
Critical Arguements
CA "[T]his article is both a description of the strategy for interoperability that was developed for the [Cornell University's NSF-funded] Site for Science and an introduction to the architecture being used by the NSDL production team."
Phrases
<P1> The grand vision is that the NSDL become a comprehensive library of every digital resource that could conceivably be of value to any aspect of education in any branch of science and engineering, both defined very broadly. <P2> Interoperability among heterogeneous collections is a central theme of the Core Integration. The potential collections have a wide variety of data types, metadata standards, protocols, authentication schemes, and business models. <P3> The goal of interoperability is to build coherent services for users, from components that are technically different and managed by different organizations. This requires agreements to cooperate at three levels: technical, content and organizational. <P4> Much of the research of the authors of this paper aims at . . . looking for approaches to interoperability that have low cost of adoption, yet provide substantial functionality. One of these approaches is the metadata harvesting protocol of the Open Archives Initiative (OAI) . . . <P5> For Site for Science, we identified three levels of digital library interoperability: Federation; Harvesting; Gathering. In this list, the top level provides the strongest form of interoperability, but places the greatest burden on participants. The bottom level requires essentially no effort by the participants, but provides a poorer level of interoperability. The Site for Science demonstration concentrated on the harvesting and gathering, because other projects were exploring federation. <P6> In an ideal world all the collections and services that the NSDL wishes to encompass would support an agreed set of standard metadata. The real world is less simple. . . . However, the NSDL does have influence. We can attempt to persuade collections to move along the interoperability curve. <warrant> <P7> The Site for Science metadata strategy is based on two principles. The first is that metadata is too expensive for the Core Integration team to create much of it. Hence, the NSDL has to rely on existing metadata or metadata that can be generated automatically. The second is to make use of as much of the metadata available from collections as possible, knowing that it varies greatly from none to extensive. Based on these principles, Site for Science, and subsequently the entire NSDL, developed the following metadata strategy: Support eight standard formats; Collect all existing metadata in these formats; Provide crosswalks to Dublin Core; Assemble all metadata in a central metadata repository; Expose all metadata records in the repository for service providers to harvest; Concentrate limited human effort on collection-level metadata; Use automatic generation to augment item-level metadata. <P8> The strategy developed by Site for Science and now adopted by the NSDL is to accumulate metadata in the native formats provided by the collections . . . If a collection supports the protocols of the Open Archives Initiative, it must be able to supply unqualified Dublin Core (which is required by the OAI) as well as the native metadata format. <P9> From a computing viewpoint, the metadata repository is the key component of the Site for Science system. The repository can be thought of as a modern variant of the traditional library union catalog, a catalog that holds comprehensive catalog records from a group of libraries. . . . Metadata from all the collections is stored in the repository and made available to providers of NSDL service.
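A minimal sketch of the harvesting level described in P5-P8, assuming a hypothetical OAI-PMH provider at http://example.org/oai. The verb, metadataPrefix and resumptionToken mechanics follow the OAI-PMH specification; the function name and the flat list output are invented for illustration.

```python
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def harvest_dc(base_url: str):
    """Yield unqualified Dublin Core records from an OAI-PMH provider,
    following resumption tokens until the full list is retrieved."""
    params = {"verb": "ListRecords", "metadataPrefix": "oai_dc"}
    while True:
        url = base_url + "?" + urllib.parse.urlencode(params)
        with urllib.request.urlopen(url) as resp:
            tree = ET.parse(resp)
        for rec in tree.iter(OAI + "record"):
            meta = rec.find(OAI + "metadata")
            if meta is not None:
                # Collect every Dublin Core element (title, creator, ...)
                yield [(el.tag.replace(DC, ""), el.text)
                       for el in meta.iter() if el.tag.startswith(DC)]
        token = tree.find(f".//{OAI}resumptionToken")
        if token is None or not (token.text or "").strip():
            break
        params = {"verb": "ListRecords", "resumptionToken": token.text.strip()}

# Hypothetical usage:
# for record in harvest_dc("http://example.org/oai"):
#     print(record)
```

Records arriving this way in a native format other than oai_dc would then pass through the crosswalks to Dublin Core mentioned in P7 before entering the repository.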
Conclusions
RQ 1 "Can a small team of librarians manage the collection development and metadata strategies for a very large library?" RQ 2 "Can the NSDL actually build services that are significantly more useful than the general web search services?"
CA Through OAI, access to resources is effected in a low-cost, interoperable manner.
Phrases
<P1> The need for a metadata format that would support both metadata creation by authors and interoperability across heterogeneous repositories led to the choice of unqualified Dublin Core. (p.16) <P2> OAI develops and promotes a low-barrier interoperability framework and associated standards, originally to enhance access to e-print archives, but now taking into account access to other digital materials. (p.16)
Conclusions
RQ The many players involved in cultural heritage need to work together to define standards and best practices.
Type
Electronic Journal
Title
Metadata Corner: Working Meeting on Electronic Records Research
CA Just as the digital library has forced librarians to rethink their profession, so has the electronic record for archivists and recordkeepers. Much of the debate centers around what e-records are and how to deal with them.
Phrases
<P1> Their presentations at the Working Meeting elaborated on the concept of "literary warrant," which can be defined as the mandate from outside the archives profession -- from law, professional best practices and other special sources -- which requires the creation and maintenance of records. (p.1) <P2> Systems that records professionals devise to maintain these concepts over time, may, therefore, be of interest to information professionals interested in digital preservation. Authenticity, indeed, is a key component of Peter Graham's description of "intellectual preservation." (p.3)
Conclusions
RQ How can metadata be linked to its record over time? How can we ensure the "least-loss" migration of metadata over time? Collaboration with warrant creators from other fields such as lawyers and auditors is desirable.
CA Metadata is a key part of the information infrastructure necessary to organize and classify the massive amount of information on the Web. Metadata, just like the resources they describe, will range in quality and be organized around different principles. Modularity is critical to allow metadata schema designers to base their new creations on established schemas, thereby benefiting from best practices rather than reinventing elements each time. Extensibility and cost-effectiveness are also important factors. Controlled vocabularies provide greater precision and access. Multilingualism (translating specification documents into many languages) is an important step in fostering global metadata architecture(s).
Phrases
<P1> The use of controlled vocabularies is another important approach to refinement that improves the precision for descriptions and leverages the substantial intellectual investment made by many domains to improve subject access. (p.4) <P2> Standards typically deal with these issues through the complementary processes of internationalization and localization: the former process relates to the creation of "neutral" standards, whereas the latter refers to the adaptation of such a neutral standard to a local context. (p.4)
Conclusions
RQ For the Web to realize its full potential for resource discovery, a "convergence" of standards and semantics must occur.
Type
Electronic Journal
Title
Primary Sources, Research, and the Internet: The Digital Scriptorium at Duke
First Monday, Peer Reviewed Journal on the Internet
Publication Year
1997
Volume
2
Issue
9
Critical Arguements
CA "As the digital revolution moves us ever closer to the idea of the 'virtual library,' repositories of primary sources and other archival materials have both a special opportunity and responsibility. Since the materials in their custody are, by definition, often unique, these institutions will need to work very carefully with scholars and other researchers to determine what is the most effective way of making this material accessible in a digital environment."
Phrases
<P1> The matter of Internet access to research materials and collections is not one of simply doing what we have always done -- except digitally. It represents instead an opportunity to rethink the fundamental triangular relationship between libraries and archives, their collections, and their users. <P2> Digital information as it exists on the Internet today requires more navigational, contextual, and descriptive data than is currently provided in traditional card catalogs or their more modern electronic equivalent. One simply cannot throw up vast amounts of textual or image-based data onto the World Wide Web and expect existing search engines to make much sense of it or users to be able to digest the results. ... Archivists and manuscript curators have for many years now been providing just that sort of contextual detail in the guides, finding aids, and indexes that they have traditionally prepared for their holdings. <P3> Those involved in the Berkeley project understood that HTML was essentially a presentational encoding scheme and lacked the formal structural and content-based encoding that SGML would offer. <P4> Encoded Archival Description is quickly moving towards becoming an internationally embraced standard for the encoding of archival metadata in a wide variety of archival repositories and special collections libraries. And the Digital Scriptorium at Duke has become one of the early implementors of this standard. <warrant>
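To illustrate the kind of structured, content-based encoding that P3 and P4 contrast with presentational HTML, here is a minimal, invented finding-aid skeleton built with Python's standard XML tools. The element names (eadheader, archdesc, did, dsc, c01) are real EAD 1.0 tags, but the record content is hypothetical and far sparser than a production finding aid.

```python
import xml.etree.ElementTree as ET

# Skeleton of a finding aid: header, collection-level description,
# then the component list (<dsc>) that carries the granular detail.
ead = ET.Element("ead")
header = ET.SubElement(ead, "eadheader")
ET.SubElement(header, "eadid").text = "us-xx-example001"  # hypothetical id
title = ET.SubElement(ET.SubElement(ET.SubElement(header, "filedesc"),
                                    "titlestmt"), "titleproper")
title.text = "Guide to the Example Family Papers"

archdesc = ET.SubElement(ead, "archdesc", level="collection")
did = ET.SubElement(archdesc, "did")
ET.SubElement(did, "unittitle").text = "Example Family Papers"
ET.SubElement(did, "unitdate").text = "1850-1920"

dsc = ET.SubElement(archdesc, "dsc")
c01 = ET.SubElement(dsc, "c01", level="series")
ET.SubElement(ET.SubElement(c01, "did"), "unittitle").text = "Correspondence"

print(ET.tostring(ead, encoding="unicode"))
```

Unlike an HTML rendering, every element here names what the content is (a title, a date, a series), which is what lets a search engine query finding aids "at a highly granular level of detail."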
Conclusions
RQ "Duke is currently involved in a project that is funded through NEH and also involves the libraries of Stanford, the University of Virginia, and the University of California-Berkeley. This project (dubbed the "American Heritage Virtual Digital Archives Project") will create a virtual archive of encoded finding aids from all four institutions. This archive will permit seamless searching of these finding aids -- at a highly granular level of detail -- through a single search engine on one site and will, it is hoped, provide a model for a more comprehensive national system in the near future."
Type
Electronic Journal
Title
The Warwick Framework: A container architecture for diverse sets of metadata
This paper is an abbreviated version of The Warwick Framework: A Container Architecture for Aggregating Sets of Metadata. It describes a container architecture for aggregating logically, and perhaps physically, distinct packages of metadata. This "Warwick Framework" is the result of the April 1996 Metadata II Workshop in Warwick, U.K.
ISBN
1082-9873
Critical Arguements
CA Describes the Warwick Framework, a proposal for linking together the various metadata schemes that may be attached to a given information object by using a system of "packages" and "containers." "[Warwick Workshop] attendees concluded that ... the route to progress on the metadata issue lay in the formulation of a higher-level context for the Dublin Core. This context should define how the Core can be combined with other sets of metadata in a manner that addresses the individual integrity, distinct audiences, and separate realms of responsibility of these distinct metadata sets. The result of the Warwick Workshop is a container architecture, known as the Warwick Framework. The framework is a mechanism for aggregating logically, and perhaps physically, distinct packages of metadata. This is a modularization of the metadata issue with a number of notable characteristics. It allows the designers of individual metadata sets to focus on their specific requirements, without concerns for generalization to ultimately unbounded scope. It allows the syntax of metadata sets to vary in conformance with semantic requirements, community practices, and functional (processing) requirements for the kind of metadata in question. It separates management of and responsibility for specific metadata sets among their respective "communities of expertise." It promotes interoperability by allowing tools and agents to selectively access and manipulate individual packages and ignore others. It permits access to the different metadata sets that are related to the same object to be separately controlled. It flexibly accommodates future metadata sets by not requiring changes to existing sets or the programs that make use of them."
Phrases
<P1> The range of metadata needed to describe and manage objects is likely to continue to expand as we become more sophisticated in the ways in which we characterize and retrieve objects and also more demanding in our requirements to control the use of networked information objects. The architecture must be sufficiently flexible to incorporate new semantics without requiring a rewrite of existing metadata sets. <warrant> <P2> Each logically distinct metadata set may represent the interests of and domain of expertise of a specific community. <P3> Just as there are disparate sources of metadata, different metadata sets are used by and may be restricted to distinct communities of users and agents. <P4> Strictly partitioning the information universe into data and metadata is misleading. <P5> If we allow for the fact that metadata for an object consists of logically distinct and separately administered components, then we should also provide for the distribution of these components among several servers or repositories. The references to distributed components should be via a reliable persistent name scheme, such as that proposed for Universal Resources Names (URNs) and Handles. <P6> [W]e emphasize that the existence of a reliable URN implementation is necessary to avoid the problems of dangling references that plague the Web. <warrant> <P7> Anyone can, in fact, create descriptive data for a networked resource, without permission or knowledge of the owner or manager of that resource. This metadata is fundamentally different from the metadata that the owner of a resource chooses to link or embed with the resource. We, therefore, informally distinguish between two categories of metadata containers, which both have the same implementation [internally referenced and externally referenced metadata containers].
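A data-structure sketch of the container/package architecture, under stated assumptions: the class names and type-name strings are invented for illustration, and real packages would carry typed operations rather than opaque bytes. It shows the distinction drawn in P5-P7 between embedded packages and externally referenced ones named by a persistent URN, and how an agent selects only the package types it understands.

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Package:
    """One logically distinct metadata set (e.g., Dublin Core, rights
    data), managed by its own community with its own syntax."""
    type_name: str   # tells clients which operations the package supports
    content: bytes   # opaque to clients that do not know the type

@dataclass
class Indirect:
    """An externally stored package, referenced by a persistent name
    (URN) rather than embedded, so it can live on another server."""
    urn: str

@dataclass
class Container:
    """Aggregates packages for one information object. Agents select
    the package types they understand and ignore the rest."""
    packages: List[Union[Package, Indirect]] = field(default_factory=list)

    def select(self, type_name: str) -> List[Package]:
        return [p for p in self.packages
                if isinstance(p, Package) and p.type_name == type_name]

doc = Container([
    Package("dublin-core", b"<dc:title>...</dc:title>"),
    Package("terms-and-conditions", b"..."),
    Indirect("urn:example:md/1234"),  # hypothetical URN
])
print([p.type_name for p in doc.select("dublin-core")])
```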
Conclusions
RQ "We run the danger, with the full expressiveness of the Warwick Framework, of creating such complexity that the metadata is effectively useless. Finding the appropriate balance is a central design problem. ... Definers of specific metadata sets should ensure that the set of operations and semantics of those operations will be strictly defined for a package of a given type. We expect that a limited set of metadata types will be widely used and 'understood' by browsers and agents. However, the type system must be extensible, and some method that allows existing clients and agents to process new types must be a part of a full implementation of the Framework. ... There is a need to agree on one or more syntaxes for the various metadata sets. Even in the context of the relatively simple World Wide Web, the Internet is often unbearably slow and unreliable. Connections often fail or time out due to high load, server failure, and the like. In a full implementation of the Warwick Framework, access to a "document" might require negotiation across distributed repositories. The performance of this distributed architecture is difficult to predict and is prone to multiple points of failure. ... It is clear that some protocol work will need to be done to support container and package interchange and retrieval. ... Some examination of the relationship between the Warwick Framework and ongoing work in repository architectures would likely be fruitful.
Type
Electronic Journal
Title
Collection-Based Persistent Digital Archives - Part 1
The preservation of digital information for long periods of time is becoming feasible through the integration of archival storage technology from supercomputer centers, data grid technology from the computer science community, information models from the digital library community, and preservation models from the archivists' community. The supercomputer centers provide the technology needed to store the immense amounts of digital data that are being created, while the digital library community provides the mechanisms to define the context needed to interpret the data. The coordination of these technologies with preservation and management policies defines the infrastructure for a collection-based persistent archive. This paper defines an approach for maintaining digital data for hundreds of years through development of an environment that supports migration of collections onto new software systems.
ISBN
1082-9873
Critical Arguements
CA "Supercomputer centers, digital libraries, and archival storage communities have common persistent archival storage requirements. Each of these communities is building software infrastructure to organize and store large collections of data. An emerging common requirement is the ability to maintain data collections for long periods of time. The challenge is to maintain the ability to discover, access, and display digital objects that are stored within an archive, while the technology used to manage the archive evolves. We have implemented an approach based upon the storage of the digital objects that comprise the collection, augmented with the meta-data attributes needed to dynamically recreate the data collection. This approach builds upon the technology needed to support extensible database schema, which in turn enables the creation of data handling systems that interconnect legacy storage systems."
Phrases
<P1> The ultimate goal is to preserve not only the bits associated with the original data, but also the context that permits the data to be interpreted. <warrant> <P2> We rely on the use of collections to define the context to associate with digital data. The context is defined through the creation of semi-structured representations for both the digital objects and the associated data collection. <P3> A collection-based persistent archive is therefore one in which the organization of the collection is archived simultaneously with the digital objects that comprise the collection. <P4> The goal is to preserve digital information for at least 400 years. This paper examines the technical issues that must be addressed and presents a prototype implementation. <P5> Digital object representation. Every digital object has attributes that define its structure, physical context, and provenance, and annotations that describe features of interest within the object. Since the set of attributes (such as annotations) will vary across all objects within a collection, a semi-structured representation is needed. Not all digital objects will have the same set of associated attributes. <P6> If possible, a common information model should be used to reference the attributes associated with the digital objects, the collection organization, and the presentation interface. An emerging standard for a uniform data exchange model is the eXtensible Markup Language (XML). <P7> A particular example of an information model is the XML Document Type Definition (DTD) which provides a description for the allowed nesting structure of XML elements. Richer information models are emerging such as XSchema (which provides data types, inheritance, and more powerful linking mechanisms) and XMI (which provides models for multiple levels of data abstraction). <P8> Although XML DTDs were originally applied to documents only, they are now being applied to arbitrary digital objects, including the collections themselves. More generally, OSDs can be used to define the structure of digital objects, specify inheritance properties of digital objects, and define the collection organization and user interface structure. <P9> A persistent collection therefore needs the following components of an OSD to completely define the collection context: Data dictionary for collection semantics; Digital object structure; Collection structure; and User interface structure. <P10> The re-creation or instantiation of the data collection is done with a software program that uses the schema descriptions that define the digital object and collection structure to generate the collection. The goal is to build a generic program that works with any schema description. <P11> The information for which driver to use for access to a particular data set is maintained in the associated Meta-data Catalog (MCAT). The MCAT system is a database containing information about each data set that is stored in the data storage systems. <P12> The data handling infrastructure developed at SDSC has two components: the SDSC Storage Resource Broker (SRB) that provides federation and access to distributed and diverse storage resources in a heterogeneous computing environment, and the Meta-data Catalog (MCAT) that holds systemic and application or domain-dependent meta-data about the resources and data sets (and users) that are being brokered by the SRB. <P13> A client does not need to remember the physical mapping of a data set.
It is stored as meta-data associated with the data set in the MCAT catalog. <P14> A characterization of a relational database requires a description of both the logical organization of attributes (the schema), and a description of the physical organization of attributes into tables. For the persistent archive prototype we used XML DTDs to describe the logical organization. <P15> A combination of the schema and physical organization can be used to define how queries can be decomposed across the multiple tables that are used to hold the meta-data attributes. <P16> By using an XML-based database, it is possible to avoid the need to map between semi-structured and relational organizations of the database attributes. This minimizes the amount of information needed to characterize a collection, and makes the re-creation of the database easier. <warrant> <P17> Digital object attributes are separated into two classes of information within the MCAT: System-level meta-data that provides operational information. These include information about resources (e.g., archival systems, database systems, etc., and their capabilities, protocols, etc.) and data objects (e.g., their formats or types, replication information, location, collection information, etc.); Application-dependent meta-data that provides information specific to particular data sets and their collections (e.g., Dublin Core values for text objects). <P18> Internally, MCAT keeps schema-level meta-data about all of the attributes that are defined. The schema-level attributes are used to define the context for a collection and enable the instantiation of the collection on new technology. <P19> The logical structure should not be confused with a database schema and is more general than that. For example, we have implemented the Dublin Core database schema to organize attributes about digitized text. The attributes defined in the logical structure that is associated with the Dublin Core schema contain information about the subject, constraints, and presentation formats that are needed to display the schema along with information about its use and ownership. <P20> The MCAT system supports the publication of schemata associated with data collections, schema extension through the addition or deletion of new attributes, and the dynamic generation of the SQL that corresponds to joins across combinations of attributes. <P21> By adding routines to access the schema-level meta-data from an archive, it is possible to build a collection-based persistent archive. As technology evolves and the software infrastructure is replaced, the MCAT system can support the migration of the collection to the new technology.
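A toy stand-in for the catalog behaviour described in P11-P13: clients hold only a logical name, and the physical mapping plus application metadata live in the catalog. All names here are hypothetical, and a real MCAT is a database holding schema-level metadata, not an in-memory dictionary; the sketch only shows the indirection that lets storage move without breaking clients.

```python
from dataclasses import dataclass

@dataclass
class CatalogEntry:
    logical_name: str    # what the client remembers
    resource: str        # brokered storage system (archive, DB, ...)
    physical_path: str   # where the bits actually live
    app_metadata: dict   # domain attributes, e.g., Dublin Core values

class MetadataCatalog:
    """Minimal illustration of an MCAT-style lookup."""
    def __init__(self):
        self._entries = {}

    def register(self, entry: CatalogEntry) -> None:
        self._entries[entry.logical_name] = entry

    def resolve(self, logical_name: str) -> CatalogEntry:
        return self._entries[logical_name]

mcat = MetadataCatalog()
mcat.register(CatalogEntry(
    logical_name="email-collection/msg-000001",
    resource="tape-archive",                       # hypothetical resource
    physical_path="/archive/2000/batch7/m1.xml",
    app_metadata={"dc:title": "Re: budget", "dc:date": "1999-03-14"},
))
print(mcat.resolve("email-collection/msg-000001").physical_path)
```

Migration then amounts to updating the resource and physical_path fields while the logical name, and everything that cites it, stays stable.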
Conclusions
RQ Collection-Based Persistent Digital Archives - Part 2
SOW
DC "The technology proposed by SDSC for implementing persistent archives builds upon interactions with many of these groups. Explicit interactions include collaborations with Federal planning groups, the Computational Grid, the digital library community, and individual federal agencies." ... "The data management technology has been developed through multiple federally sponsored projects, including the DARPA project F19628-95-C-0194 "Massive Data Analysis Systems," the DARPA/USPTO project F19628-96-C-0020 "Distributed Object Computation Testbed," the Data Intensive Computing thrust area of the NSF project ASC 96-19020 "National Partnership for Advanced Computational Infrastructure," the NASA Information Power Grid project, and the DOE ASCI/ASAP project "Data Visualization Corridor." Additional projects related to the NSF Digital Library Initiative Phase II and the California Digital Library at the University of California will also support the development of information management technology. This work was supported by a NARA extension to the DARPA/USPTO Distributed Object Computation Testbed, project F19628-96-C-0020."
Type
Electronic Journal
Title
Collection-Based Persistent Digital Archives - Part 2
"Collection-Based Persistent Digital Archives: Part 2" describes the creation of a one million message persistent E-mail collection. It discusses the four major components of a persistent archive system: support for ingestion, archival storage, information discovery, and presentation of the collection. The technology to support each of these processes is still rapidly evolving, and opportunities for further research are identified.
ISBN
1082-9873
Critical Arguements
CA "The multiple migration steps can be broadly classified into a definition phase and a loading phase. The definition phase is infrastructure independent, whereas the loading phase is geared towards materializing the processes needed for migrating the objects onto new technology. We illustrate these steps by providing a detailed description of the actual process used to ingest and load a million-record E-mail collection at the San Diego Supercomputer Center (SDSC). Note that the SDSC processes were written to use the available object-relational databases for organizing the meta-data. In the future, it may be possible to go directly to XML-based databases."
Phrases
<P1> The processes used to ingest a collection, transform it into an infrastructure independent form, and store the collection in an archive comprise the persistent storage steps of a persistent archive. The processes used to recreate the collection on new technology, optimize the database, and recreate the user interface comprise the retrieval steps of a persistent archive. <P2> In order to build a persistent collection, we consider a solution that "abstracts" all aspects of the data and its preservation. In this approach, data objects and processes are codified by raising them above the machine/software dependent forms to an abstract format that can be used to recreate the object and the processes in any new desirable forms. <P3> The SDSC infrastructure uses object-relational databases to organize information. This makes data ingestion more complex by requiring the mapping of the XML DTD semi-structured representation onto a relational schema. <P4> The steps used to store the persistent archive were: (1) Define Digital Object: define meta-data, define object structure (OBJ-DTD) --- (A), define object DTD to object DDL mapping --- (B) (2) Define Collection: define meta-data, define collection structure (COLL-DTD) --- (C), define collection DTD structure to collection DDL mapping --- (D) (3) Define Containers: define packing format for encapsulating data and meta-data (examples are the AIP standard, Hierarchical Data Format, Document Type Definition) <P5> In the ingestion phase, the relational and semi-structured organization of the meta-data is defined. No database is actually created, only the mapping between the relational organization and the object DTD. <P6> Note that the collection relational organization does not have to encompass all of the attributes that are associated with a digital object. Separate information models are used to describe the objects and the collections. It is possible to take the same set of digital objects and form a new collection with a new relational organization. <P7> Multiple communities across academia, the federal government, and standards groups are exploring strategies for managing very large archives. The persistent archive community needs to maintain interactions with these communities to track development of new strategies for data management and storage. <warrant> <P8>
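Step (1)(B) above - the object DTD to object DDL mapping - can be pictured as generating CREATE TABLE statements from the logical attribute list of the e-mail object. The attribute names and SQL types below are invented for illustration; the point is that the logical organization stays infrastructure independent while the emitted DDL targets whatever relational database is current at load time.

```python
# Logical organization of one e-mail digital object, as attribute names
# (in practice derived from the XML DTD); the loading phase turns the
# list into DDL for the database technology of the day.
EMAIL_ATTRIBUTES = {
    "message_id": "VARCHAR(256)",
    "sender":     "VARCHAR(256)",
    "recipient":  "VARCHAR(256)",
    "sent_date":  "TIMESTAMP",
    "subject":    "VARCHAR(1024)",
    "body_ref":   "VARCHAR(512)",  # pointer to the archived content
}

def make_ddl(table: str, attributes: dict) -> str:
    """Emit a CREATE TABLE statement from a logical attribute list."""
    cols = ",\n  ".join(f"{name} {sqltype}"
                        for name, sqltype in attributes.items())
    return f"CREATE TABLE {table} (\n  {cols}\n);"

print(make_ddl("email_collection", EMAIL_ATTRIBUTES))
```

Re-running the generator against a future database's dialect is the loading-phase half of the migration; the attribute list itself never changes.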
Conclusions
RQ "The four major components of the persistent archive system are support for ingestion, archival storage, information discovery, and presentation of the collection. The first two components focus on the ingestion of data into collections. The last two focus on access to the resulting collections. The technology to support each of these processes is still rapidly evolving. Hence consensus on standards has not been reached for many of the infrastructure components. At the same time, many of the components are active areas of research. To reach consensus on a feasible collection-based persistent archive, continued research and development is needed. Examples of the many related issues are listed below:
Type
Electronic Journal
Title
Search for Tomorrow: The Electronic Records Research Program of the U.S. National Historical Publications and Records Commission
The National Historical Publications and Records Commission (NHPRC) is a small grant-making agency affiliated with the U.S. National Archives and Records Administration. The Commission is charged with promoting the preservation and dissemination of documentary source materials to ensure an understanding of U.S. history. Recognizing that the increasing use of computers created challenges for preserving the documentary record, the Commission adopted a research agenda in 1991 to promote research and development on the preservation and continued accessibility of documentary materials in electronic form. From 1991 to the present the Commission awarded 31 grants totaling $2,276,665 for electronic records research. Most of this research has focused on two issues of central concern to archivists: (1) electronic record keeping (tools and techniques to manage electronic records produced in an office environment, such as word processing documents and electronic mail), and (2) best practices for storing, describing, and providing access to all electronic records of long-term value. NHPRC grants have raised the visibility of electronic records issues among archivists. The grants have enabled numerous archives to begin to address electronic records problems, and, perhaps most importantly, they have stimulated discussion about electronic records among archivists and records managers.
Publisher
Elsevier Science Ltd
Critical Arguements
CA "The problem of maintaining electronic records over time is big, expensive, and growing. A task force on digital archives established by the Commission on Preservation and Access in 1994 commented that the life of electronic records could be characterized in the same words Thomas Hobbes once used to describe life: ÔÇ£nasty, brutish, and shortÔÇØ [1]. Every day, thousands of new electronic files are created on federal, state, and local government computers across the nation. A small but important portion of these records will be designated for permanent retention. Government agencies are increasingly relying on computers to maintain information such as census files, land titles, statistical data, and vital records. But how should electronic records with long-term value be maintained? Few government agencies have developed comprehensive policies for managing current electronic records, much less preserving those with continuing value for historians and other researchers. Because of this serious and growing problem, the National Historical Publications and Records Commission (NHPRC), a small grantmaking agency affiliated with the U.S. National Archives and Records Administration (NARA), has been making grants for research and development on the preservation and use of electronic documentary sources. The program is conducted in concert with NARA, which in 1996 issued a strategic plan that gives high priority to mastering electronic records problems in partnership with federal government agencies and the NHPRC.
Phrases
<P1> How can data dictionaries, information resource directory systems, and other metadata systems be used to support electronic records management and archival requirements? <P2> In spite of the number of projects the Commission has supported, only four questions from the research agenda have been addressed to date. Of these, the question relating to requirements for the development of data dictionaries and other metadata systems (question number four) has produced a single grant for a state information locator system in South Carolina, and the question relating to needs for archival education (question 10) has led to two grants to the Society of American Archivists for curricular materials. <P3> Information systems created without regard for these considerations may have deficiencies that limit the usefulness of the records contained on them. <warrant> <P4> The NHPRC has awarded major grants to four institutions over the past five years for projects to develop and test requirements for electronic record keeping: University of Pittsburgh (1993): A working set of functional requirements and metadata specifications for electronic record keeping systems; City of Philadelphia (1995, 1996, and 1997): A project to incorporate a subset of the Pittsburgh metadata specifications into a new human resources information system and other city systems as test cases and to develop comprehensive record keeping policies and standards for the city's information technology systems; Indiana University (1995): A project to develop an assessment tool and methodology for analyzing existing electronic records systems, using the Pittsburgh functional requirements as a model and the student academic record system and a financial system as test cases; Research Foundation of the State University of New York-Albany, Center for Technology in Government (1996): A project to identify best practices for electronic record keeping, including work by the U.S. Department of Defense and the University of British Columbia in addition to the University of Pittsburgh. The Center is working with the state's Adirondack Parks Agency in a pilot project to develop a system model for incorporating record keeping and archival considerations into the creation of networked computing and communications applications. <P5> No definitive solution has yet been identified for the problems posed by electronic records, although progress has been made in learning what will be needed to design functional electronic record keeping systems. <P6> With the proliferation of digital libraries, the need for long-term storage, migration and retrieval strategies for electronic information has become a priority for a wide variety of information providers. <warrant>
Conclusions
RQ "How best to preserve existing and future electronic formats and provide access to them over time has remained elusive. The answers cannot be found through theoretical research alone, or even through applied research, although both are needed. Answers can only emerge over time as some approaches prove able to stand the test of time and others do not. The problems are large because the costs of maintaining, migrating, and retrieving electronic information continue to be high." ... "Perhaps most importantly, these grants have stimulated widespread discussion of electronic records issues among archivists and record managers, and thus they have had an impact on the preservation of the electronic documentary record that goes far beyond the CommissionÔÇÖs investment."
SOW
DC The National Historical Publications and Records Commission (NHPRC) is the outreach arm of the National Archives and makes plans for and studies issues related to the preservation, use and publication of historical documents. The Commission also makes grants to non-Federal archives and other organizations to promote the preservation and use of America's documentary heritage.
Type
Report
Title
Mapping of the Encoded Archival Description DTD Element Set to the CIDOC CRM
The CIDOC CRM is the first ontology designed to mediate contents in the area of material cultural heritage and beyond, and has been accepted by ISO TC46 as work item for an international standard. The EAD Document Type Definition (DTD) is a standard for encoding archival finding aids using the Standard Generalized Markup Language (SGML). Archival finding aids are detailed guides to primary source material which provide fuller information than that normally contained within cataloging records. 
Publisher
Institute of Computer Science, Foundation for Research and Technology - Hellas
Publication Location
Heraklion, Crete, Greece
Language
English
Critical Arguements
CA "This report describes the semantic mapping of the current EAD DTD Version 1.0 Element Set to the CIDOC CRM and its latest extension. This work represents a proof of concept for the functionality the CIDOC CRM is designed for." 
Conclusions
RQ "Actually, the CRM seems to do the job quite well ÔÇô problems in the mapping arise more from underspecification in the EAD rather than from too domain-specific notions. "┬á... "To our opinion, the archival community could benefit from the conceptualizations of the CRM to motivate more powerful metadata standards with wide interoperability in the future, to the benefit of museums and other disciplines as well."
SOW
DC "As a potential international standard, the EAD DTD is maintained in the Network Development and MARC Standards Office of the Library of Congress in partnership with the Society of American Archivists." ... "The CIDOC Conceptual Reference Model (see [CRM1999], [Doerr99]), in the following only referred to as ┬½CRM┬╗, is outcome of an effort of the Documentation Standards Group of the CIDOC Committee (see ┬½http:/www.cidoc.icom.org┬╗, ÔÇ£http://cidoc.ics.forth.grÔÇØ) of ICOM, the International Council of Museums beginning in 1996."
In July 1999, the Australian Recordkeeping Metadata Schema (RKMS) was approved by its academic and industry steering group. This metadata set now joins other community-specific sets in being available for use and implementation into workplace applications. The RKMS has inherited elements from and built on many other metadata standards associated with information management. It has also contributed to the development of subsequent sector-specific recordkeeping metadata sets. The importance of the RKMS as a framework for 'mapping' or reading other sets and also as a standardised set of metadata available for adoption in diverse implementation environments is now emerging. This paper explores the context of the SPIRT Recordkeeping Metadata Project, and the conceptual models developed by the SPIRT Research Team as a framework for standardising and defining Recordkeeping Metadata. It then introduces the elements of the SPIRT Recordkeeping Metadata Schema and explores its functionality before discussing implementation issues with reference to document management and workflow technologies.
Critical Arguements
CA Much of the metadata work done so far has worked off the passive assumption of records as document-like objects. Instead, records need to be seen as active entities in business transactions.
Conclusions
RQ In order to decide which elements are to be used from the RKMS, organizations need to delineate the reach of specific implementations in terms of how and when records need to be bound with metadata.
Type
Web Page
Title
CDL Digital Object Standard: Metadata, Content and Encoding
This document addresses the standards for digital object collections for the California Digital Library. Adherence to these standards is required for all CDL contributors and may also serve University of California staff as guidelines for digital object creation and presentation. These standards are not intended to address all of the administrative, operational, and technical issues surrounding the creation of digital object collections.
Critical Arguements
CA These standards describe the file formats, storage and access standards for digital objects created by or incorporated into the CDL as part of the permanent collections. They attempt to balance adherence to industry standards, reproduction quality, access, potential longevity and cost.
Conclusions
RQ not applicable
SOW
DC "This is the first version of the CDL Digital Object Standard. This version is based upon the September 1, 1999 version of the CDL's Digital Image Standard, which included recommendations of the Museum Educational Site Licensing Project (MESL), the Library of Congress and the MOA II participants." ... "The Museum Educational Site Licensing Project (MESL) offered a framework for seven collecting institutions, primarily museums, and seven universities to experiment with new ways to distribute visual information--both images and related textual materials. " ... "The Making of America (MoA II) Testbed Project is a Digital Library Federation (DLF) coordinated, multi-phase endeavor to investigate important issues in the creation of an integrated, but distributed, digital library of archival materials (i.e., digitized surrogates of primary source materials found in archives and special collections). The participants include Cornell University, New York Public Library, Pennsylvania State University, Stanford University and UC Berkeley. The Library of Congress white papers and standards are based on the experience gained during the American Memory Pilot Project. The concepts discussed and the principles developed still guide the Library's digital conversion efforts, although they are under revision to accomodate the capabilities of new technologies and new digital formats." ... "The CDL Technical Architecture and Standards Workgroup includes the following members with extensive experience with digital object collection and management: Howard Besser, MESL and MOA II digital imaging testbed projects; Diane Bisom, University of California, Irvine; Bernie Hurley, MOA II, University of California, Berkeley; Greg Janee, Alexandria Digital Library; John Kunze, University of California, San Francisco; Reagan Moore and Chaitanya Baru, San Diego Supercomputer Center, ongoing research with the National Archives and Records Administration on the long term storage and retrieval of digital content; Terry Ryan, University of California, Los Angeles; David Walker, California Digital Library"
The creation and use of metadata is likely to become an important part of all digital preservation strategies, whether they are based on hardware and software conservation, emulation or migration. The UK Cedars project aims to promote awareness of the importance of digital preservation, to produce strategic frameworks for digital collection management policies and to promote methods appropriate for long-term preservation - including the creation of appropriate metadata. Preservation metadata is a specialised form of administrative metadata that can be used as a means of storing the technical information that supports the preservation of digital objects. In addition, it can be used to record migration and emulation strategies, to help ensure authenticity, and to note rights management and collection management data; it will also need to interact with resource discovery metadata. The Cedars project is attempting to investigate some of these issues and will provide some demonstrator systems to test them.
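As a rough illustration of the scope of preservation metadata described above, a single record might bundle technical, provenance and rights information alongside a link to descriptive metadata. The field names in this sketch are invented for illustration, not the Cedars element set.

```python
# Hypothetical preservation metadata record; every element name here is
# an assumption, shown only to make the categories above concrete.
preservation_metadata = {
    "identifier": "urn:example:cedars:0001",       # persistent reference
    "technical": {                                 # supports rendering
        "format": "TIFF 6.0",
        "fixity": {"algorithm": "SHA-256", "digest": "..."},
    },
    "provenance": [                                # migration history
        {"event": "migration", "from": "TIFF 5.0",
         "to": "TIFF 6.0", "date": "1998-06-01"},
    ],
    "rights": {"access": "UK HE only"},            # rights management
    "descriptive_link": "urn:example:catalogue/0001",  # resource discovery
}
print(preservation_metadata["technical"]["format"])
```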
Notes
This article was presented at the Joint RLG and NPO Preservation Conference: Guidelines for Digital Imaging, held September 28-30, 1998.
Critical Arguements
CA "Cedars is a project that aims to address strategic, methodological and practical issues relating to digital preservation (Day 1998a). A key outcome of the project will be to improve awareness of digital preservation issues, especially within the UK higher education sector. Attempts will be made to identify and disseminate: Strategies for collection management ; Strategies for long-term preservation. These strategies will need to be appropriate to a variety of resources in library collections. The project will also include the development of demonstrators to test the technical and organisational feasibility of the chosen preservation strategies. One strand of this work relates to the identification of preservation metadata and a metadata implementation that can be tested in the demonstrators." ... "The Cedars Access Issues Working Group has produced a preliminary study of preservation metadata and the issues that surround it (Day 1998b). This study describes some digital preservation initiatives and models with relation to the Cedars project and will be used as a basis for the development of a preservation metadata implementation in the project. The remainder of this paper will describe some of the metadata approaches found in these initiatives."
Conclusions
RQ "The Cedars project is interested in helping to develop suitable collection management policies for research libraries." ... "The definition and implementation of preservation metadata systems is going to be an important part of the work of custodial organisations in the digital environment."
SOW
DC "The Cedars (CURL exemplars in digital archives) project is funded by the Joint Information Systems Committee (JISC) of the UK higher education funding councils under Phase III of its Electronic Libraries (eLib) Programme. The project is administered through the Consortium of University Research Libraries (CURL) with lead sites based at the Universities of Cambridge, Leeds and Oxford."
Type
Web Page
Title
Metadata for preservation : CEDARS project document AIW01
This report is a review of metadata formats and initiatives in the specific area of digital preservation. It supplements the DESIRE Review of metadata (Dempsey et al. 1997). It is based on a literature review and information picked up at a number of workshops and meetings, and is an attempt to briefly describe the state of the art in the area of metadata for digital preservation.
Critical Arguements
CA "The projects, initiatives and formats reviewed in this report show that much work remains to be done. . . . The adoption of persistent and unique identifiers is vital, both in the CEDARS project and outside. Many of these initiatives mention "wrappers", "containers" and "frameworks". Some thought should be given to how metadata should be integrated with data content in CEDARS. Authenticity (or intellectual preservation) is going to be important. It will be interesting to investigate whether some archivists' concerns with custody or "distributed custody" will have relevance to CEDARS."
Conclusions
RQ Which standards and initiatives described in this document have proved viable preservation metadata models?
SOW
DC OAIS emerged out of an initiative spearheaded by NASA's Consultative Committee for Space Data Systems. It has been shaped and promoted by the RLG and OCLC. Several international projects have played key roles in shaping the OAIS model and adapting it for use in libraries, archives and research repositories. OAIS-modeled repositories include the CEDARS Project, Harvard's Digital Repository, Koninklijke Bibliotheek (KB), the Library of Congress' Archival Information Package for audiovisual materials, MIT's D-Space, OCLC's Digital Archive and TERM: the Texas Email Repository Model.
CA In March 2003, the intention of undertaking an international survey of LOM implementations was announced at the plenary meeting of the "Information Technology for Learning, Education and Training", ISO/IEC JTC1/SC36 sub-committee. The ISO/IEC JTC1/SC36 committee is international in both membership and emphasis, and has a working group, Working Group (WG) 4, "Management and Delivery for Learning, Education, and Training," which has been explicitly charged with the task of contributing to future standardization work on the LOM. <warrant> The international LOM Survey focuses on two questions: 1) "Which elements were selected for use or population?"; and 2) "How were these elements used, or what were the types of values assigned to them?" This report also attempts to draw a number of tentative suggestions and conclusions for further standardization work.
Conclusions
RQ Based on its findings, the preliminary survey report was able to suggest a number of conclusions: First, fewer and better-defined elements may be more effective than the range of choice and interpretive possibilities currently allowed by the LOM. This seems to be especially the case regarding educational elements, which are surprisingly underutilized for metadata that is ostensibly and primarily educational. Second, clear and easily-supported means of working with local, customized vocabularies would also be very valuable. Third, it also seems useful to ensure that structures are provided to accommodate complex but more conventional aspects of resource description. These would include multiple title versions, as well as multilingual descriptions and values.
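The third suggestion, structures for multilingual titles and local vocabularies, can be pictured with a simplified record such as the sketch below. It is an illustrative reduction of the LOM data model in Python, not the normative IEEE binding; the vocabulary source name is invented.

    # Illustrative, simplified LOM-style record in Python; not the
    # normative IEEE 1484.12.1 XML binding. Language-tagged strings
    # model the multilingual titles the survey recommends supporting.
    lom_record = {
        "general": {
            "title": [  # multiple language-tagged values for one title
                {"language": "en", "value": "Introduction to Plate Tectonics"},
                {"language": "fr", "value": "Introduction à la tectonique des plaques"},
            ],
        },
        "educational": {
            # values drawn from a local, customized vocabulary (invented name)
            "learningResourceType": {"source": "local-vocab-v1", "value": "lecture"},
            "context": {"source": "local-vocab-v1", "value": "higher education"},
        },
    }

    def titles_for(record: dict, language: str) -> list[str]:
        """Return every title value matching the requested language tag."""
        return [t["value"] for t in record["general"]["title"]
                if t["language"] == language]

    print(titles_for(lom_record, "fr"))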
SOW
DC On June 12, 2002, IEEE 1484.12.1-2002, Learning Object Metadata (LOM), was approved by the IEEE Standards Association.
Type
Web Page
Title
Towards a Digital Rights Expression Language Standard for Learning Technology
CA The Learning Technology Standards Committee (LTSC) of the Institute of Electrical and Electronics Engineers (IEEE) concentrated on making recommendations for standardizing a digital rights expression language (DREL) with the specific charge to (1) Investigate existing standards development efforts for DREL and digital rights. (2) Gather DREL requirements germane to the learning, education, and training industries. (3) Make recommendations as to how to proceed. (4) Feed requirements into ongoing DREL and digital rights standardization efforts, regardless of whether the LTSC decides to work with these efforts or embark on its own. This report represents the achievement of these goals in the form of a white paper that can be used as reference for the LTSC, that reports on the current state of existing and proposed standardization efforts targeting digital rights expression languages, and makes recommendations concerning future work.
Conclusions
RQ The recommendations of this report are: 1. Maintain appropriate liaisons between learning technology standards development organizations and those standards development organizations standardizing rights expression languages. The purpose of these liaisons is to continue to feed requirements into broader standardization efforts and to ensure that the voice of the learning, education and training community is heard. 2. Support the creation of application profiles or extensions of XrML and ODRL that include categories and vocabularies for roles common in educational and training settings. In the case of XrML, a name space for local context may be needed. (A name space is required for both XrML and ODRL for the "application profile" or, specifically, the application (LT application) extension.) 3. Advocate the creation of a standard for expressing local policies in ways that can be mapped to rights expressions. This could be either through a data model or through the definition of an API or service. 4. Launch an initiative to identify models of rights enforcement in learning technology and to possibly abstract a common model for use by architecture and framework definition projects. 5. Further study the implications of patent claims, especially for educational and research purposes.
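Recommendation 3, expressing local policies so that they can be mapped onto a rights expression, might take a shape like the following sketch. The policy fields, role names and the output structure are hypothetical; the mapping only approximates the flavour of ODRL/XrML permissions rather than quoting either language.

    # Hypothetical local policy for a course resource, expressed as plain data.
    local_policy = {
        "resource": "urn:example:course-pack-42",
        "allowed_roles": ["student", "instructor"],  # roles common in education
        "actions": ["display", "print"],
        "expires": "2004-06-30",
    }

    def to_rights_expression(policy: dict) -> dict:
        """Map a local policy onto a rights-expression-like structure.

        Illustrative only: the keys below approximate the flavour of a
        rights expression language; they are not normative ODRL or XrML
        element names.
        """
        return {
            "target": policy["resource"],
            "permissions": [
                {"action": action,
                 "constraints": {"roles": policy["allowed_roles"],
                                 "not_after": policy["expires"]}}
                for action in policy["actions"]
            ],
        }

    print(to_rights_expression(local_policy))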
Abstract The ability of investigators to share data is essential to the progress of integrative scientific research both within and across disciplines. This paper describes the main issues in achieving effective data sharing based on previous efforts in building scientific data networks and, particularly, recent efforts within the Earth sciences. This is presented in the context of a range of information architectures for effecting differing levels of standardization and centralization, from both a technology perspective and a publishing protocol perspective. We propose a new Metadata Interchange Format (.mif) that can be used for more effective sharing of data and metadata across digital libraries, data archives and research projects.
Critical Arguements
CA "In this paper, we discuss two important information technology aspects of the electronic publication of data in the Earth sciences, metadata, and a variety of different concepts of electronic data publication. Metadata are the foundation of electronic data publications and they are determined by needs of archiving, the scientific analysis and reproducibility of a data set, and the interoperability of diverse data publication methods. We use metadata examples drawn from the companion paper by Staudigel et al. (this issue) to illustrate the issues involved in scaling-up the publication of data and metadata by individual scientists, disciplinary groups, the Earth science community-at-large and to libraries in general. We begin by reviewing current practices and considering a generalized alternative." ... 'For this reason, we will we first discuss different methods of data publishing via a scientific data network followed by an inventory of desirable characteristics of such a network. Then, we will introduce a method for generating a highly portable metadata interchange format we call .mif (pronounced dot-mif) and conclude with a discussion of how this metadata format can be scaled to support the diversity of interests within the Earth science community and other scientific communities." ... "We can borrow from the library community the methods by which to search for the existence and location of data (e.g., Dublin Core http://www.dublincore.org) but we must invent new ways to document the metadata needed within the Earth sciences and to comply with other metadata standards such as the Federal Geographic Data Committee (FGDC). To accomplish this, we propose a metadata interchange format that we call .mif that enables interoperability and an open architecture that is maximally independent of computer systems, data management approaches, proprietary software and file formats, while encouraging local autonomy and community cooperation. "
Conclusions
RQ "These scalable techniques are being used in the development of a project we call SIOExplorer that can found at http://sioexplorer.ucsd.edu although we have not discussed that project in any detail. The most recent contributions to this discussion and .mif applications and examples may be found at http:\\Earthref.org\metadata\GERM\."
SOW
DC This article was written by representatives of the San Diego Supercomputer Center and the Institute of Geophysics and Planetary Physics under the auspices of the University of California, San Diego.
CA The role of archives and archivists is being fundamentally redefined in consideration of postcustodial theories and practice.
Conclusions
RQ Who is accountable? How explicit should the "imprint" of the archivist be in the shaping of the record? Who decides (and how) what we remember and what we keep?
Type
Web Page
Title
Creating and Documenting Text: A Guide to Good Practice
CA "The aim of this Guide is to take users through the basic steps involved in creating and documenting an electronic text or similar digital resource. ... This Guide assumes that the creators of electronic texts have a number of common concerns. For example, that they wish their efforts to remain viable and usable in the long-term, and not to be unduly constrained by the limitations of current hardware and software. Similarly, that they wish others to be able to reuse their work, for the purposes of secondary analysis, extension, or adaptation. They also want the tools, techniques, and standards that they adopt to enable them to capture those aspects of any non-electronic sources which they consider to be significant -- whilst at the same time being practical and cost-effective to implement."
Conclusions
RQ "While a single metadata scheme, adopted and implemented wholescale would be the ideal, it is probable that a proliferation of metadata schemes will emerge and be used by different communities. This makes the current work centred on integrated services and interoperability all the more important. ... The Warwick Framework (http://www.ukoln.ac.uk/metadata/resources/wf.html) for example suggests the concept of a container architecture, which can support the coexistence of several independently developed and maintained metadata packages which may serve other functions (rights management, administrative metadata, etc.). Rather than attempt to provide a metadata scheme for all web resources, the Warwick Framework uses the Dublin Core as a starting point, but allows individual communities to extend this to fit their own subject-specific requirements. This movement towards a more decentralised, modular and community-based solution, where the 'communities of expertise' themselves create the metadata they need has much to offer. In the UK, various funded organisations such as the AHDS (http://ahds.ac.uk/), and projects like ROADS (http://www.ilrt.bris.ac.uk/roads/) and DESIRE (http://www.desire.org/) are all involved in assisting the development of subject-based information gateways that provide metadata-based services tailored to the needs of particular user communities."
This guide is optimized for creation of EAD-encoded finding aids for the collections of New York University and New York Historical Society. The links on the page list tools and files that may be downloaded and referenced for production of NYU-conformant finding aids.
Publisher
New York University
Critical Arguements
CA "This guide is optimized for creation of EAD-encoded finding aids for the collections of New York University and New York Historical Society. Instructions assume the use of NoteTab as the XML editor, utilizing template files that serve as base files for the different collections." 
Conclusions
RQ
SOW
DC This guide serves both New York University and the New York Historical Society.
Expanded version of the article "Ensuring the Longevity of Digital Documents" that appeared in the January 1995 edition of Scientific American (Vol. 272, Number 1, pp. 42-7).
Publisher
Council on Library and Information Resources
Critical Arguements
CA "It is widely accepted that information technology is revolutionizing our concepts of documents and records in an upheaval at least as great as the introduction of printing, if not of writing itself. The current generation of digital records therefore has unique historical significance; yet our digital documents are far more fragile than paper. In fact, the record of the entire present period of history is in jeopardy. The content and historical value of many governmental, organizational, legal, financial, and technical records, scientific databases, and personal documents may be irretrievably lost to future generations if we do not take steps to preserve them."
Conclusions
RQ "We must develop evolving standards for encoding explanatory annotations to bootstrap the interpretation of digital documents that are saved in nonstandard forms. We must develop techniques for saving the bit streams of software-dependent documents and their associated systems and application software. We must ensure that the hardware environments necessary to run this software are described in sufficient detail to allow their future emulation. We must save these specifications as digital documents, encoded using the bootstrap standards developed for saving annotations so that they can be read without special software (lest we be recursively forced to emulate one system in order to learn how to emulate another). We must associate contextual information with our digital documents to provide provenance as well as explanatory annotations in a form that can be translated into successive standards so as to remain easily readable. Finally, we must ensure the systematic and continual migration of digital documents onto new media, preserving document and program bit streams verbatim, while translating their contextual information as necessary."
Type
Web Page
Title
Metadata Resources: Metadata Encoding and Transmission Standard (METS)