The following is a paper written by one of HCLE’s metadata volunteers, Svetlana Ushakova, San Jose State University. Reprinted (and slightly edited) with permission.
“The open nature of OER makes the usability of metadata standards especially important, because most OER projects are done by volunteers or with limited financial support.” – Svetlana Ushakova (@SvetUsha73)
Usability of Metadata Standards for Open Educational Resources:
The Perspective of Creators and Contributors of Resources
Svetlana Ushakova
San Jose State University
The emergence and development of the Internet as the most convenient way, so far, of sharing information resources has given birth to many ideas about how to make the world a better place. Among them is the idea of sharing knowledge for free instead of turning it into a commodity, especially when knowledge is used for education. In the 1990s, several people and institutions put their educational resources online as open documents. This was the beginning of a movement, and in 2002 the term “open educational resources” (OER) was coined (Organisation for Economic Co-operation and Development, Centre for Educational Research and Innovation [OECD, CERI], 2007, pp. 23-24). There are several definitions of OER, but the main idea they share is this: OER are documents and media that are free to access and may be used and reused for teaching, learning, and research (“Open educational resources”, 2015). Even though not all definitions say that OER are necessarily digital documents, the way they exist and are shared (via the Internet) implies that they should be in digital format. Over the past twenty years, the community of people and institutions involved in creating and using OER, as well as the infrastructure supporting OER, has been developing actively. As a result, there are now several repositories that facilitate the sharing of OER, schemas for special licensing, particular tools and software, etc.
Metadata is an important part of that infrastructure. For example, in 2009, two years after its launch, OER Commons, one of the biggest OER digital libraries, already had over 20,000 resources from nearly seventy partner collections (“Why OER?”, 2010). Metadata is crucial for searching such collections successfully and for classifying resources by certain criteria. Keyword search alone would not be effective enough, not only because it usually does not work well for extracting attributes such as author, title, and subject area, but also because there are many non-textual resources among OER. Besides that, OER have attributes of their own that are not covered directly by general metadata standards, e.g. the type of educational resource, grade level, alignment to educational standards, etc. Metadata is also an important tool for the quality evaluation of OER (always an essential issue for any kind of open resources), because metadata helps to find the resource most suitable for a particular use case. In addition, some metadata imply the comparison of a resource with a particular standard (e.g. an educational standard or grade level), and many current OER metadata standards include well-developed rating systems.
The OER community has advanced a great deal in developing and implementing specialized OER metadata schemas. A leader in that area is the OER digital library OER Commons, launched in 2007 by the Institute for the Study of Knowledge Management in Education (ISKME). Even before that, in 2004, ISKME, with a grant from the William and Flora Hewlett Foundation, started a research project aimed at developing a metadata standard for a new OER digital library. To do so, they analyzed and cross-walked metadata standards used by other OER providers, such as MIT OpenCourseWare and Connexions, as well as general standards such as Dublin Core (Petrides, 2013). Parallel research was conducted by some European institutions, e.g. European Schoolnet (OECD, CERI, pp. 101-103). Later, in 2012, OER Commons, together with the nonprofit education reform organization Achieve, developed an evaluation tool (“Helping educators determine the quality”, [2012]) and incorporated it into their metadata schema. The metadata standards created by OER Commons are not the only ones used to describe OER, but they may be considered among the most robust. In 2012, the U.S. Department of Education launched the Learning Registry, a tool that aims to collect metadata about OER from different platforms and resources so that educators can start their search for OER in one place (Cavanagh, 2014).
There are still many problems and open questions about using metadata for OER that are actively being discussed. Some of them are common to metadata in general, e.g. interoperability between the metadata used by different content providers, OER libraries, and repositories, as well as between metadata for OER and common metadata standards. The problems of creating controlled vocabularies and of ensuring the quality and completeness of metadata also remain unsolved. Other considerations are more related to the open nature of OER, and here the main issue is the rights and licensing of metadata.
The problem I am going to explore is common to metadata standards in general but has some special aspects related to the open nature of the resources. It may be labeled a problem of usability of current metadata standards, but it actually includes some other questions. Usability can be studied from the perspective of creators (or sometimes providers) of resources and from the perspective of users of OER, i.e. educators or learners. From the perspective of creators of OER, the main issue is that most of them do not have special skills and knowledge about metadata in general or about OER metadata standards in particular. From the point of view of educators and learners who use OER, the main problem is how much the existing OER metadata standards facilitate the retrieval of resources. The problem includes questions of the sufficiency of metadata elements, the balance of quality and cost, and the comprehensibility and intuitiveness of controlled vocabularies. This paper focuses on the perspective of creators (contributors) of OER. The specific questions I am interested in are: 1) what ways of adding metadata to OER exist and what pros and cons they have, and 2) what kind of knowledge and skills a creator should have to add OER metadata.
To answer these questions, I will use retrospective and descriptive methods that allow me to study how OER metadata schemas have been developed and how different usability issues are being addressed. Another method is the comparison of metadata schemas. User tests will be conducted to analyze the usability of metadata schemas.
Literature Review
As the OER movement is a relatively new phenomenon, there are not many publications in the form of books or even scholarly articles, especially on such specific aspects as OER metadata schemas and their usability. However, there are many reports, specifications, presentations, discussion posts, etc. that accompany the active process of developing metadata schemas for OER.
One of the first reports was prepared and published in 2007 by the Centre for Educational Research and Innovation (CERI), a committee of the Organisation for Economic Co-operation and Development (OECD). The report was based only on surveys of OER for higher education, but most of the tendencies and issues it revealed apply to OER in general. It discussed several important aspects of OER, among them the role of metadata in effective searching for OER (OECD-CERI, 2007, pp. 100-101). For the authors, proper metadata is a tool that helps to ensure and improve the quality of OER (p. 100). This statement was repeated in a recent OECD presentation on OER (Van Damme, 2014, sl. 27-29).
According to OECD-CERI research, the main problems were the time-consuming nature of adding metadata and the discrepancy between the perspective of the person who adds metadata and that of the people who will use it for search (OECD-CERI, 2007, p. 100). As an alternative way of creating metadata, they considered the Metadata Ecology for Learning and Teaching (MELT) project carried out by European Schoolnet. This project combined two approaches: “a priori metadata (before use) added by expert indexers and a posteriori metadata (after use) gathered from teachers/learners and machines” (p. 101). The results of the MELT project were presented in its final report in 2009 (Assche et al., 2009). Key results were: 1) the importance of enriching OER metadata with social tagging, 2) how automated generation of some metadata and automated translation of metadata values helped to increase the usage of OER by users whose native language differed from the language of the resource, and 3) the unexpected areas and ways in which indexers coded the metadata (Assche et al., 2009, p. 3). The report contains valuable conclusions about the process and effectiveness of social tagging (folksonomy) of OER. For example, the authors noticed that users who participated in social tagging tended to use tags from the controlled vocabulary (p. 24). A quite unexpected result was that even though social tags were found useful for describing resources, they were not as useful for finding resources (pp. 31-32).
The usability of metadata cannot be explored without studying the history of the creation and development of current OER metadata schemas and without an overview of those schemas. One of the earliest metadata schemas created to describe and administer learning resources, both open and commercial, was the standard for Learning Object Metadata (LOM), designed in 2002 by the Institute of Electrical and Electronics Engineers Standards Association (IEEE). The standard is actively used by publishers and digital repositories, and the creators of other OER metadata standards often use it as a starting point for developing their own. This is one reason there is a relatively large body of literature on LOM, including scholarly articles and conference papers. A briefing paper prepared by P. Barker of the Centre for Educational Technology, Interoperability and Standards (CETIS) described the LOM data model, its elements and vocabularies of values, and its areas of use, along with recommendations about creating user-oriented application profiles for LOM and links to some profiles (Barker, 2005). The use of LOM raised some practical questions and issues, for example the limitations and complexity of LOM encountered by developers of customized application profiles (Agostinho, Bennett, Lockyer, & Harper, 2004, pp. 204-205).
A serious problem was revealed by a survey conducted by a group of Canadian specialists involved in the development of another metadata standard, CanCore. According to the survey, the efforts of the IEEE to develop a standard more elaborate than Dublin Core (LOM has 76 elements versus Dublin Core’s 15) were mostly in vain, because most customized application profiles required only those elements taken from Dublin Core, while the new, optional elements were hardly used by indexers. The same might be said about the elaborate set of values (Friesen, 2004). We have to take into account that this survey and its report were prepared by LOM’s competitors, who consistently criticized the IEEE’s product for its excessive complexity (Friesen, Roberts, & Fisher, 2002, “Educational Metadata: The IEEE LOM Standard”), and I was not able to find a proper description of the survey itself (the links mentioned in the article no longer work). Even so, the problem of finding the right balance between elaboration and usability does exist, and other authors have also pointed it out. As mentioned above, the same tendency was noticed in the OECD-CERI report (2007, p. 24). J. Najjar and E. Duval (2006) made a series of studies of learning object repositories built on LOM standards and concluded that both indexers (adding metadata) and users (searching for a resource) tend to use a small number of metadata elements and values. However, unlike N. Friesen, they saw the reason for this not only in the excessive complexity of LOM but also in other factors: the indexing tools’ interfaces were not adapted for indexers, and the search tools’ default settings contained only a limited number of very common elements (Najjar & Duval, 2006).
In the ten years following the creation of LOM, various governmental, non-profit, and commercial institutions in different countries continued to adapt existing metadata schemas and to develop new ones, including schemas for OER. In 2011, the Learning Resource Metadata Initiative (LRMI) project was started. The goal of the project was to develop special properties to be added to the schema.org framework to describe learning resources. Documentation on LRMI may be found on the LRMI website. The schema and the process of its implementation by ten OER repositories are discussed in two papers prepared by CETIS experts (Barker, 2014; Campbell & Barker, 2014).
Many OER repositories try to create their own metadata schemas, or at least application profiles. One of the most successful is the metadata schema developed by the digital OER library OER Commons. L. Petrides (2013), the president of the Institute for the Study of Knowledge Management in Education (ISKME), the parent organization of OER Commons, gave a brief overview of the history of creating and developing the OER Commons metadata profile. Their credo was to create rich metadata to enhance the retrieval of OER, and it required considerable effort, including technical tutoring for content creators. They also survived a period of doubt about the usefulness of rich metadata, and now they actively participate in various metadata-sharing projects, e.g. the Learning Registry (Cavanagh, 2014), and map their metadata to other schemas, e.g. LRMI (Petrides, 2013). A blog post by CETIS expert L. M. Campbell (2014) contains more technical details about OER Commons metadata, including descriptions of the four ways metadata can be added to resources collected by the repository. According to Campbell, the OER Commons metadata schema and infrastructure are robust and allow different approaches to adding metadata and combining it with other standards.
The OER Commons metadata schema is one of the first metadata standards for learning resources to embed information about the alignment of a resource with educational standards, such as the Common Core. This data is part of the evaluation tool that ISKME implemented in partnership with the nonprofit education reform organization Achieve. The creators of the tool consider it a mechanism that helps improve the quality of OER and the effectiveness of search; those goals were announced in a joint press release (ISKME & Achieve, 2011). More detailed information about the evaluation rubrics, along with training materials for users, is available on the Achieve website (OER rubrics, n.d.). Achieve prepared a detailed instruction document explaining to users how to align a resource with an educational standard and rate the degree of alignment. According to the instructions, it is a complicated process that requires “the input of knowledgeable and experienced teachers and administrators” (Achieve, 2013, sl. 27). However, I was not able to find information about how many users participated in that evaluation, the quality of the evaluation, or whether OER Commons uses other methods (e.g. author input) to align a resource to an educational standard.
In general, the alignment of OER to educational standards, and adding that information to metadata, is a popular topic in the OER community. It is an important element of learning resource metadata because it significantly helps educators find the right resource among thousands of others, and because it is an indicator of quality. After ten years of development, the Achievement Standards Network (ASN) offers an open access service for competency framework data. It is not only a library of all current educational standards in the USA and UK; it is a linked data service that supports crosswalks between different standards and the alignment of learning resources to them. A presentation by the ASN developer S. A. Sutton (2012) explains how it works and where it may be applied. The ASN website offers demos and tools that allow users to test ASN as a data network.
The literature describes various aspects of metadata schemas for OER, including questions of user experience. At the same time, because most documents come from the creators of those schemas (particularly for relatively new schemas), there are not enough critical and analytical papers about the usability of metadata schemas to permit a scholarly evaluation of competing schemas.
User Tests
To analyze how creators or contributors of OER may add the necessary metadata to their resources, I tested the two main ways of doing so. The first way is to add metadata directly to an HTML file. The other is to submit a resource to one of the OER digital libraries that provide an interface for creating a set of metadata.
The first way is more convenient for organizations and individuals who have a collection of OER and the infrastructure (i.e. a website) on which to place their resources. They usually need only a tool to improve the retrieval of those resources. My goal was to create metadata for a very popular educational video game, “Oregon Trail” (the 1990 version). The first version of the game was published by the Minnesota Educational Computing Consortium (MECC) in 1974. In December 2014, the Internet Archive published the 1990 version (MECC, 1990) and the 1992 version (MECC, 1992) online, and those versions are available for free. The initial web page (MECC, 1990) contains some metadata encoded in Open Graph (a schema widely used by social networking websites), namely the description, title, site name, and url properties. To create metadata relevant to educational resources, I chose one of the most popular OER metadata schemas, LRMI.
As mentioned above, LRMI was developed as an extension of schema.org, but in 2014 stewardship of the LRMI version 1.1 specification was transferred to the Dublin Core Metadata Initiative. One of the reasons for that transfer was the decision to publish the LRMI schema in RDF (Sutton, 2015). Nevertheless, all classes and properties of LRMI in DCMI have corresponding URIs in schema.org. Because of this, I created two versions of the metadata, using DC and schema.org encoding (see Appendices A and B, respectively). Most elements and properties used by DC and schema.org for describing educational resources are quite similar and interconvertible.
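This interconvertibility can be sketched in code. The partial mapping below is an illustrative assumption drawn from the properties used in this study, not an authoritative crosswalk between the two encodings:

```python
# A partial, illustrative crosswalk between the DC/LRMI meta-tag names used
# in Appendix A and their approximate schema.org counterparts. The mapping
# is an assumption for illustration, not a complete or official crosswalk.
DC_TO_SCHEMA_ORG = {
    "DC.Title": "name",
    "DC.Creator": "creator",
    "DC.Description": "description",
    "DC.Language": "inLanguage",
    "DCTERMS.Created": "dateCreated",
    "LRMI.alignmentType": "educationalAlignment.alignmentType",
}

def to_schema_org(dc_name):
    """Return the schema.org property for a DC/LRMI meta name, or None
    if the name falls outside this partial crosswalk."""
    return DC_TO_SCHEMA_ORG.get(dc_name)

print(to_schema_org("DC.Title"))  # name
```

A converter built on such a table could rewrite the Appendix A meta tags as schema.org markup automatically, which is essentially what mapping one encoding to the other amounts to.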
Both versions were validated with the Structured Data Testing Tool. The metadata elements of the DC version were extracted by the tool as “custom search results”. However, the elements that were defined by links (for example, <link rel="DC.Subject" href="http://id.loc.gov/authorities/subjects/sh85096196">) were not extracted at all. This does not necessarily mean that they are not valid as metadata, but it should be tested in another way. The metadata elements of the schema.org version were extracted as metadata for the type of web document known in schema.org as a creative work (see the screenshot of the evaluation in Appendix B). The evaluation showed that with LRMI, “Oregon Trail” gained much more visibility as an educational resource. The most important properties defined with LRMI are the educational purpose of the resource, the expected age range, and the alignment with educational standards. The content for the last one was taken from the Achievement Standards Network. I aligned the resource only with the California standards for history and social science, but ASN also makes it possible to align the resource with standards for other states.
Adding metadata to HTML files required special skills and knowledge. I had to study the specifications for both schemas and several examples. There were also some difficulties in choosing values for several metadata elements, because there is no controlled vocabulary for them and I was not sure which terms are widely used and therefore more searchable. At the same time, adding metadata directly to HTML files made it possible to customize the metadata to the peculiarities of the resource.
To test the second way of creating metadata for OER, uploading a resource to an OER digital library and using its interface, I chose OER Commons. They use their own metadata schema, which is mapped to LRMI. The website offers two options: submitting a resource that already exists on the Web and creating a new resource. The user interface for each option is slightly different. Therefore, I submitted two documents: “Oregon Trail” and a resource of my own, a Word document.
The first step for already existing documents covers general metadata properties, such as title, author, provider, and keywords. The next step contains properties that are specific to educational resources: subject (in the sense of an area of knowledge studied in school), educational use, primary user, educational material type, grade level, educational standard, media format, and intended regional relevance. The third step defines the conditions of use. For a new resource, a creator should also provide general information (title, abstract, keywords, and any authors in addition to the creator) and define metadata properties specific to educational resources, such as general subjects, primary user, educational use, grade level, language, material type, and learning goals. The last step defines the type of license.
As we can see, the sets of properties for the two options differ slightly. For example, the second one does not contain educational standard and media format, while the first one does not have language and learning goals. Presumably, a contributor of an already existing document is considered a user of the resource and, therefore, a person more competent to define whether the resource is aligned with an educational standard, while the creator of a resource knows its learning goals better. These two properties, educational standard and learning goals, may well have the same value, because educational standards are often formulated as learning goals.
For many properties, the OER Commons interface has a drop-down list or a combo box of possible options, which is a way of using controlled vocabularies for metadata. Most of them have a sufficient number of options to define the properties correctly. Yet educational standards have only three options (though the main ones): the Common Core State Standards for English language arts and for mathematics, and the Next Generation Science Standards.
The source code of the HTML files for the submitted resources contains elements and properties from several metadata schemas, including schema.org, Open Graph, hCard, and the internal OER Commons schema. Validation of the files showed that only a few metadata elements were recognized as part of schema.org, some were identified as custom search results, and many were not identified at all. The reason is that many metadata elements used by OER Commons do not have “meta” in their names. We may therefore assume that those OER Commons metadata elements are only for internal use, and that even though resources are effectively searchable inside OER Commons, they are not as searchable outside the library. However, in 2014 OER Commons mapped their metadata to LRMI (Campbell, 2014), and we may suppose that OER Commons’ internal metadata are also usable for the rest of the Web.
Results
The research project has confirmed that metadata standards specially designed to describe and manage reusable digital educational resources significantly increase the searchability of those resources. These metadata standards contain fields that most general standards do not have, and many of those fields are important for users seeking OER.
Two ways of adding metadata to OER were studied in this paper. The first way, adding metadata directly to an HTML file, is more flexible and can easily be customized depending on the type of resource and the goals and abilities of the creator (contributor). The second way, uploading a document to a digital repository and using its interface, may also be customized, but only by reducing the number of fields.
Adding metadata directly to an HTML file requires special knowledge and skills, and a lack of knowledge of controlled vocabularies may affect the searchability of resources. With the built-in interface of a digital repository, a user does not need special technical or library skills, and the controlled vocabularies incorporated in the interface help to improve the retrieval of a resource, especially inside that particular repository.
There is a possibility that metadata created by submitting a resource to a digital repository may not be completely searchable outside of the repository because many digital repositories have their own metadata standard.
Discussion
The main limitation of the study was its size. It covered two documents representing two types of OER, one metadata standard (LRMI) in its two versions, and one digital OER repository. Therefore, the results can be considered only preliminary and should serve as a starting point for another project with documents of various types and several metadata standards.
The two ways of adding metadata to describe educational resources were not tested from the users’ perspective, so the question of how much OER metadata standards have improved the searchability of resources remains open. This is another limitation of the study, and it should be addressed in future work.
Conclusions
Since the 1990s, the OER movement has developed significantly. Nowadays, OER play an important social role by helping to make education and self-education more affordable and accessible. Proper metadata standards are essential to increasing the functionality of OER: they help to find the right resource among thousands of others and to guarantee the quality of resources. However, there is a discussion about whether we need metadata for OER at all and how detailed it should be. In my opinion, most current OER metadata schemas are functional and detailed enough, and they may easily be customized by those who do not need detailed metadata. Yet for some types of OER (e.g. video games), additional fields are needed.
The open nature of OER makes the usability of metadata standards especially important, because most OER projects are done by volunteers or with limited financial support. There is therefore a need for a tool that generates HTML code for OER metadata that can be added to an HTML file. Lists of recommended values (controlled vocabularies) for certain fields should also be developed and presented to OER creators (contributors).
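A minimal sketch of what such a generator might look like is shown below. The field names, the function name, and the output format are my own illustrative assumptions, not a description of an existing tool:

```python
# Sketch of a generator that turns a dictionary of Dublin Core fields into
# <meta> tags a contributor could paste into the <head> of an HTML file.
# Field names and output format are illustrative assumptions only.
from html import escape

def generate_dc_meta(fields):
    """Render a {DC element: value} dictionary as Dublin Core <meta> tags."""
    lines = ['<link rel="schema.DC" href="http://purl.org/dc/elements/1.1/">']
    for element, value in fields.items():
        # Escape quotes and angle brackets so the value stays valid HTML.
        lines.append(
            f'<meta name="DC.{element}" content="{escape(value, quote=True)}">'
        )
    return "\n".join(lines)

print(generate_dc_meta({"Title": "Oregon Trail", "Creator": "MECC"}))
```

A real tool would additionally offer drop-down lists of recommended values for fields that have controlled vocabularies, so that a volunteer contributor never has to type a vocabulary term by hand.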
References
- Achieve. (2013). Evaluating open educational resource (OER) objects. [PowerPoint Slides].
- Retrieved from http://achieve.org/oer-rubrics
- Agostinho, S., Bennett, S., Lockyer, L. & Harper, B. (2004). Developing a learning object metadata application profile based on LOM suitable for the Australian higher education context. Australasian Journal of Educational Technology, 20(2), 191-208.
- Assche, F. V., Ayre, J., Baumgartner, P., Duval, E., Hartinger, S., Mesdom, F., … Zens, B. (2009). MELT: Final report, 1/10/2006 – 31/03/2009.
- Retrieved from MELT website: http://info.melt-project.eu/shared/data/melt/MELT_1_3_Final_Project_Report.pdf
- Barker, P. (2005). What is IEEE Learning Object Metadata / IMS Learning Resource Metadata?
- Retrieved from CETIS website: http://publications.cetis.org.uk/wp-content/uploads/2011/02/WhatIsIEEELOM.pdf
- Barker, P. (2014). Learning Resource Metadata Initiative: using schema.org to describe open educational resources.
- Retrieved from CETIS website: http://publications.cetis.org.uk/wp-content/uploads/2014/09/Paper_34-LMRI1.pdf
- Campbell, L. M. (2014, Oct 3). LRMI implementation case study: ISKME OER Commons.
- Retrieved from https://lornamcampbell.wordpress.com/2014/10/03/lrmi-implementation-case-study-iskme-oer-commons/
- Campbell, L. M. & Barker, P. (2014). LRMI implementation: Overview, issues and experiences.
- Retrieved from CETIS website: http://publications.cetis.org.uk/wp-content/uploads/2014/12/lrmiImplementationSynthesis.pdf
- Cavanagh, S. (2014, Jan 3). Federal “Learning Registry” aims to connect educators, content providers. Education Week.
- Retrieved June 25, 2015 from http://blogs.edweek.org/edweek/DigitalEducation/2014/01/federal_learning_registry_grow.html
- Friesen, N. (2004). The international learning object metadata survey. The International Review of Research in Open and Distance Learning, 5(3).
- Friesen, N., Roberts, A., & Fisher, S. (2002). CanCore: Metadata for learning objects. Canadian Journal of Learning and Technology, 28 (3).
- Retrieved from http://www.cjlt.ca/php/cjlt/article/view/109/102
- Helping educators determine the quality of open education resources. [2012].
- Retrieved June 25, 2015 from http://www.achieve.org/helping-educators-determine-quality-open-education-resources.
- Institute for the Study of Knowledge Management in Education (ISKME) & Achieve. (2011, Nov 15). Evaluation tool for aligning open education resources to Common Core standards now available at OER Commons [press release].
- Retrieved from http://www.iskme.org/media/press-releases/evaluation-tool-aligning-open-education-resources-common-core-standards-now-ava
- MECC (1990). Oregon Trail.
- Retrieved from https://archive.org/details/msdos_Oregon_Trail_The_1990
- MECC (1992). Oregon Trail Deluxe.
- Retrieved from https://archive.org/details/msdos_Oregon_Trail_Deluxe_The_1992
- Najjar, J. & Duval, E. (2006). Actual use of learning objects and metadata: An empirical analysis. TCDL Bulletin, 2 (2).
- Retrieved from http://www.ieee-tcdl.org/Bulletin/v2n2/najjar/najjar.html
- OER Rubrics. (n.d.).
- Retrieved from Achieve website http://achieve.org/oer-rubrics.
- Organisation for Economic Co-operation and Development, Centre for Educational Research and Innovation [OECD, CERI]. (2007). Giving knowledge for free: The emergence of open educational resources. [Paris, France?]: Author.
- Open educational resources (last update 2015, June 20).
- Retrieved June 22, 2015 from Wikipedia: https://en.wikipedia.org/wiki/Open_educational_resources
- Petrides, L. (2013). The selling of Open Educational Resources (OER).
- Retrieved June 25, 2015 from http://opensource.com/education/13/7/oer-selling-metadata
- Sutton, S.A. (2012, July). RDF Modeling of Learning Objectives & Outcomes: Global Connecting of Learning Resources [Video Presentation].
- Retrieved from https://s3.amazonaws.com/jestaticd2l/emp/sas/media/RDF+Modeling+of+Learning+Objectives/Overview/index.html
- Sutton, S. (2015, March 6). LRMI 1.1 Published as RDF.
- Retrieved from http://www.lrmi.net/lrmi-1-1-published-as-rdf
- Van Damme, D. (2014). Open educational resources sharing content and knowledge differently is a driver of innovation in education [SlideShare slides].
- Retrieved from http://www.slideshare.net/OECDEDU/open-educational-resources-sharing-content-and-knowledge-differently-is-a-driver-of-innovation-in-education
- Why OER? (last modified 2010, August 26).
- Retrieved June 25, 2015 from OER Wiki: http://wiki.oercommons.org/index.php/Why_OER%3F
Appendix A
LRMI Metadata encoded in Dublin Core (HTML version)
for Oregon Trail (1990)
<title>metadata</title>
<link rel="schema.DC" href="http://purl.org/dc/elements/1.1/">
<link rel="schema.DCTERMS" href="http://purl.org/dc/terms/">
<link rel="schema.LRMI" href="http://dublincore.org/lrmi-terms/">
<meta name="DC.Title" content="Oregon Trail">
<meta name="DC.Creator" content="Minnesota Educational Computing Consortium">
<link rel="DC.Subject" href="http://id.loc.gov/authorities/subjects/sh85052074">
<link rel="DC.Subject" href="http://id.loc.gov/authorities/subjects/sh85096196">
<meta name="DC.Subject" scheme="DCTERMS.LCSH" content="http://id.loc.gov/authorities/subjects/sh85102313">
<link rel="DC.Subject" href="http://id.loc.gov/authorities/subjects/sh85140204">
<meta name="DC.Description" content="As a covered wagon party of pioneers, you head out west from Independence, Missouri to the Willamette River and valley in Oregon. You first must stock up on provisions, and then, while traveling, make decisions such as when to rest, how much food to eat, etc. The Oregon Trail incorporates simulation elements and planning ahead, along with discovery and adventure, as well as mini-game-like activities (hunting and floating down the Dalles River).">
<meta name="DCTERMS.Created" content="1990">
<link rel="DC.Type" href="http://purl.org/dc/dcmitype/InteractiveResource">
<link rel="DC.Type" href="http://purl.org/dc/dcmitype/Software">
<meta name="DC.Type" scheme="DCTERMS.DCMIType" content="video game">
<link rel="DC.Identifier" href="https://archive.org/details/msdos_Oregon_Trail_The_1990">
<meta name="DC.Language" content="English">
<link rel="DCTERMS.HasVersion" href="https://archive.org/details/msdos_Oregon_Trail_Deluxe_The_1992">
<meta name="DC.Coverage" scheme="Period" content="1848">
<link rel="DC.Rights" href="https://archive.org/about/terms.php">
<meta name="DCTERMS.RightsHolder" content="MECC">
<meta name="DCTERMS.InstructionalMethod" content="game">
<meta name="DCTERMS.InstructionalMethod" content="simulation">
<meta name="LRMI.alignmentType" content="teaches">
<link rel="LRMI.targetURL" href="http://purl.org/ASN/resources/S1042BEB">
<meta name="LRMI.targetDescription" content="Students compare and contrast everyday life in different times and places around the world and recognize that some aspects of people, places, and things change over time while others stay the same.">
<meta name="LRMI.alignmentType" content="teaches">
<link rel="LRMI.targetURL" href="http://purl.org/ASN/resources/S10431D6">
<meta name="LRMI.targetDescription" content="Study transportation methods of earlier days.">
<meta name="LRMI.alignmentType" content="teaches">
<link rel="LRMI.targetURL" href="http://asn.jesandco.org/resources/S1143AB2">
<meta name="LRMI.targetDescription" content="Integrate visual information (e.g., in charts, graphs, photographs, videos, or maps) with other information in print and digital texts.">
<meta name="LRMI.alignmentType" content="teaches">
<link rel="LRMI.targetURL" href="http://purl.org/ASN/resources/S1042BED">
<meta name="LRMI.targetDescription" content="Students understand basic economic concepts and the role of individual choice in a free-market economy.">
<meta name="LRMI.educationalRole" content="student">
<meta name="LRMI.educationalUse" content="educational game">
<meta name="LRMI.interactivityType" content="active">
<meta name="LRMI.timeRequired" content="PT1H">
<meta name="LRMI.typicalAgeRange" content="7-12">
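The Dublin Core encoding above is plain HTML, so its name/content pairs can be harvested with any HTML parser. The following is a minimal sketch (not part of any official DCMI tooling) using Python's standard-library `html.parser`; the two-element sample head is an abbreviation of the appendix, included only so the snippet is self-contained.

```python
from html.parser import HTMLParser

class DCMetaParser(HTMLParser):
    """Collect (name, content) pairs from <meta name="DC..."> tags."""
    def __init__(self):
        super().__init__()
        self.records = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        # Catch both DC.* and DCTERMS.* element names
        if tag == "meta" and a.get("name", "").startswith("DC"):
            self.records.append((a["name"], a.get("content", "")))

# Abbreviated sample of the Appendix A head
html_head = '''
<meta name="DC.Title" content="Oregon Trail">
<meta name="DC.Language" content="English">
'''

parser = DCMetaParser()
parser.feed(html_head)
print(parser.records)
```

A fuller version would also follow the `<link rel="DC.Subject" href="...">` form used for URI-valued properties.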
Appendix B
1) LRMI Metadata encoded in schema.org (HTML version) for Oregon Trail (1990)
<div itemscope itemtype="http://schema.org/CreativeWork">
<span itemprop="name">Oregon Trail</span>
by <span itemprop="creator">Minnesota Educational Computing Consortium</span>
<span itemprop="about">Frontier and pioneer life</span>
<span itemprop="about">Overland journeys to the Pacific</span>
<span itemprop="about">Pioneers</span>
<span itemprop="about">United States--History--1849-1877</span>
<span itemprop="description">As a covered wagon party of pioneers, you head out west from Independence, Missouri to the Willamette River and valley in Oregon. You first must stock up on provisions, and then, while traveling, make decisions such as when to rest, how much food to eat, etc. The Oregon Trail incorporates simulation elements and planning ahead, along with discovery and adventure, as well as mini-game-like activities (hunting and floating down the Dalles River).</span>
<span itemprop="dateCreated">1990</span>
<span itemprop="url">https://archive.org/details/msdos_Oregon_Trail_The_1990</span>
<span itemprop="inLanguage">en-US</span>
<span itemprop="copyrightHolder">MECC</span>
<span itemprop="educationalUse">educational game</span>
<span itemprop="interactivityType">active</span>
<span itemprop="timeRequired">PT1H</span>
<span itemprop="typicalAgeRange">7-12</span>
<span itemprop="educationalAlignment" itemscope itemtype="http://schema.org/AlignmentObject">
<meta itemprop="alignmentType" content="teaches">
<meta itemprop="targetUrl" content="http://purl.org/ASN/resources/S1042BEB" />
<span itemprop="targetDescription">Students compare and contrast everyday life in different times and places around the world and recognize that some aspects of people, places, and things change over time while others stay the same.</span>
</span>
</div>
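Because the schema.org encoding carries its values as visible element text rather than in meta attributes, extracting it means tracking which `itemprop` a text node belongs to. The sketch below illustrates that with Python's standard-library `html.parser`; it is a simplification (it ignores `itemscope` nesting, so the AlignmentObject's properties would land in the same flat dictionary) and the two-property snippet is an abbreviation of the appendix.

```python
from html.parser import HTMLParser

class ItempropParser(HTMLParser):
    """Collect itemprop values: element text for <span>, content= for <meta>."""
    def __init__(self):
        super().__init__()
        self.props = {}
        self._current = None  # itemprop awaiting its text node

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "span" and "itemprop" in a:
            self._current = a["itemprop"]
        elif tag == "meta" and "itemprop" in a:
            self.props[a["itemprop"]] = a.get("content", "")

    def handle_data(self, data):
        if self._current and data.strip():
            self.props[self._current] = data.strip()
            self._current = None

    def handle_endtag(self, tag):
        if tag == "span":
            self._current = None

# Abbreviated sample of the Appendix B markup
snippet = ('<div itemscope itemtype="http://schema.org/CreativeWork">'
           '<span itemprop="name">Oregon Trail</span>'
           '<span itemprop="typicalAgeRange">7-12</span></div>')

p = ItempropParser()
p.feed(snippet)
print(p.props)
```

Production consumers would instead use a microdata-aware parser that respects `itemscope` boundaries and emits nested items.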
2) Result of validating metadata in the HTML file