Project Electron Update: Aurora Usability Testing

Following our initial release of Aurora, we’ve continued to improve the application through usability testing with RAC staff members. This process has been essential in identifying usability issues in Aurora and guiding our strategy for fixing them. In this post, I want to share our approach to usability testing, a summary of our findings and fixes, and our next steps.


1968: The Ford Foundation Gets a Computer

Today’s post comes from Rachel Wimpee, Historian and Project Director in our Research and Education division. Rachel uncovered this story while working with the Ford Foundation archives held at the RAC, and asked if it might be worth posting here. I only had to quickly skim the text to see the relevance for this blog.

A couple of broad themes jumped out at me when I read this piece. The first is the durability of modes of speaking and thinking about technology, which seem to persist despite (or perhaps because of) rapid technological changes. Artificial intelligence and machine learning, both currently hot tech trends, figure heavily in this story from 1965. You’ll also notice efficiency being invoked as the ultimate justification for technology, even in a situation where increasing the profit margin didn’t apply. This story is also an excellent illustration of the socially constructed nature of technology. As Rachel’s piece reveals, technology is the result of consensus and compromise. There are negotiations mediated by money, practicality, and personality. Not only that, but technology and underlying processes are often so intertwined as to be indistinguishable, and each is often blamed for things the other produces.

In many ways, this is a cautionary tale of what happens when we start with the new shiny thing rather than the needs of users (something that Evgeny Morozov and others have called “solutionism”). It’s not all bad, though. Rachel writes about the training plan the Ford Foundation implemented when staff began to use an IBM 360/30 mainframe for data processing in the late 1960s, as well as a regular process of evaluation and change implementation which lasted well into the 1970s. This reminded me of the importance of ongoing cycles of training and evaluation. New technologies usually require humans to learn new things, so a plan for both teaching and evaluating the effectiveness of that teaching should be part of any responsible technology change process. The D-Team is thinking a lot about training these days, particularly in the context of Project Electron, which will embed technologies into our workflows in a holistic way. Even though the project won’t be complete until the end of the year, we’re already scheduling training to amplify our colleagues’ existing skills and expertise so they can feel confident working with digital records.


SAA Workshop on Digital Records Acquisition: Archivists’ Reflections

Introduction

Educating archivists and record keepers is the first step in developing a digital program. Recently, members of the Processing and Collections Management teams at the Rockefeller Archive Center attended a two-day workshop titled “Appraisal, Accessioning, and Ingest of Digital Records” offered by SAA and presented by Erin Faulder, Digital Archivist at Cornell University Library’s Division of Rare and Manuscript Collections. The Digital Archives Specialist (DAS) course delved into the challenges of preserving and managing electronic records and offered strategies for institutions both unfamiliar with and well versed in the realm of digital archiving.

Here are the reflections of four archivists who participated in the DAS workshop.


Learning from Liberating Structures

Like many managers, I have a lot of meetings, so I’m always looking for ways to make sure I get the most out of them. Am I hearing from everyone at the table? Are a group’s best ideas being surfaced, or am I just hearing from the extroverts? How can I get my team engaged in strategic planning? Consequently, I’m always on the lookout for tools and techniques to make meetings – one-on-ones, team conversations, administrative updates and beyond – useful, engaging and inclusive.

A couple of years ago, the always-incredible Tara Robertson pointed me towards Liberating Structures. Although I’ve experimented with them a bit over the past few years, I’ve struggled to cut through some of the jargon (particularly the innovation-speak, which especially bugs me) and, let’s face it, the information architecture and graphic design of the official website. However, Tara encouraged me to look for a training opportunity, and after several years, I was finally able to attend a Liberating Structures training led by Fisher Qua at NYC’s Outward Bound headquarters in Long Island City. This two-day intensive workshop made all the difference in helping me understand what Liberating Structures are and how they can be used.

Documentation Site Release: A Tool for Access and Transparency, a Push for Better Documentation Writing

We are excited to announce the release of the Rockefeller Archive Center Documentation Website, a central platform for storing and sharing our institutional policies, workflows, guides, and other forms of documentation! We also want our site to generate more critical thought about how and why we write documentation, especially in terms of how we manage revisions to content, what formatting decisions we make in order to provide meaningful structure, and how we can use our documentation to contribute to the larger archives community. As a well-resourced institution, we feel a professional responsibility to be transparent about the tools and procedures we develop that may benefit our fellow archivists.

RAC Documentation Site homepage


The Values of Open Communities

I (relatively) recently came back from Open Repositories with a myriad of jumbled thoughts bouncing around in my head about aligning communities, values, software, and expectations within libraries and archives. Hopefully, this blog post will serve as an outline for the thoughts that have been percolating for a few weeks, and really, for a few years before that. I’ve met a significant number of professionals who I know share these opinions, and I think it’s helpful to spend some time reflecting on the ideas they’ve imparted, on how we, as members of a community, can better align our actions with our values, and on the difficulties that work presents.

Project Electron Update: Introducing Aurora 1.0

We are very pleased to announce the initial release of Aurora, an application to receive, virus check, and validate the structure and contents of digital records transfers. It provides a read-only interface for representatives of donor organizations to track transfers, so that they can follow their records as they move through the archival lifecycle. It also includes functionality for RAC staff to add or update organization accounts and users associated with them, appraise incoming transfers, and initiate the accessioning process. Aurora is built on community-driven standards and specifications, and we have released it as open source software. This is a major milestone for Project Electron, and we are excited to share it with the world. Many thanks to our partners at Marist College IT and to the Ford Foundation for their generous support of the project.

Aurora homescreen
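
As a rough illustration of what a receive-and-validate step can involve (this is not Aurora’s actual code), here is a minimal Python sketch. It assumes transfers arrive as BagIt bags and that a local ClamAV daemon is running; the bagit and clamd libraries and the function shown are illustrative choices, not taken from the project.

```python
# Hypothetical sketch of a receive-and-validate step for a digital records
# transfer. NOT Aurora's code; assumes transfers arrive as BagIt bags and
# that a local ClamAV daemon (clamd) can read the transfer directory.
import os

import bagit  # Library of Congress bagit-python
import clamd  # client library for the ClamAV daemon


def validate_transfer(bag_path):
    """Virus check and validate one transfer, returning a list of error strings."""
    errors = []
    bag = bagit.Bag(bag_path)

    # Virus check each payload file through the ClamAV daemon.
    scanner = clamd.ClamdUnixSocket()
    for payload_file in bag.payload_files():
        result = scanner.scan(os.path.join(bag_path, payload_file))
        for _path, (status, signature) in result.items():
            if status == "FOUND":
                errors.append(f"Virus found in {payload_file}: {signature}")

    # Validate the bag's structure and checksums against its manifests.
    try:
        bag.validate()
    except bagit.BagValidationError as e:
        errors.append(f"Bag validation failed: {e}")

    return errors
```

In a real workflow, the results of checks like these would feed the appraisal and accessioning steps that follow.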

We will continue to improve Aurora as we test and integrate it with a chain of other archival management and digital preservation tools.

Read more about Project Electron here.


Modeling for Project Electron

In her most recent blog post, Hannah wrote about our approach to Project Electron’s proposed systems integration architecture. One of our goals with Project Electron is to support the flow of data about digital materials between our systems and to get valuable information to researchers in new ways. Supporting data in motion is integral to Project Electron’s success, and while Hannah and Hillel have been hammering away at creating a comprehensive overview of the microservices architecture, I’ve been working with the entire archive center to develop a draft data model for the discovery and display of born-digital and digitized materials. If, as we’ve been thinking, Project Electron is about creating infrastructure to support data, a data model will in turn act as a blueprint for that infrastructure. Data models are tools we can use to communicate and define how we want data to move between systems, and we think understanding how our data will move through our systems to our researchers is vital to the success of the entire project.
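
To make “data model” a little more concrete, here is a purely hypothetical Python sketch of what a single record in a discovery-oriented model might contain. The class and field names are invented for illustration and do not reflect the RAC’s draft model.

```python
# Hypothetical illustration of one record in a discovery-oriented data model.
# Field names are invented; this is not the RAC's draft model.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class DigitalObject:
    """A born-digital or digitized item as a discovery interface might see it."""
    identifier: str                                       # persistent identifier
    title: str
    dates: List[str] = field(default_factory=list)        # e.g. ["1968"]
    creators: List[str] = field(default_factory=list)
    parent_collection: Optional[str] = None               # link into the archival hierarchy
    rights_statement: Optional[str] = None                # governs access and display
    formats: List[str] = field(default_factory=list)      # e.g. ["application/pdf"]


# Example of the kind of record a discovery system might receive from an API.
example = DigitalObject(
    identifier="example-0001",
    title="Sample digitized report",
    dates=["1968"],
    creators=["Ford Foundation"],
    parent_collection="example-collection",
    rights_statement="Open for research",
    formats=["application/pdf"],
)
```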

2018 IA Summit Report: Designing for Ecosystems

I recently attended the IA Summit 2018 in Chicago. This was my first time attending the conference, which brings together a mix of information architects and design-related professionals, and I came away with some fresh perspectives on my work here at the RAC. The summit consisted of both practical talks about specific methods and tools and wider reflections on ethical considerations and trends in the field.

Project Electron Update: Systems Integration Architecture

The underlying architecture that enables the movement of data between systems is a key aspect of Project Electron. In our project values, we talk about components as modular and generalizable, independently deployable, and flexible enough to accommodate integrations with changing systems. The project value to “support data in motion” recognizes the strength of duplicate and distributed data, and articulates Project Electron’s approach to systems as points at which humans interact with or manage that data. All of this is to say that our strategic decisions about system architecture, particularly with regard to systems integration, are essential to the project’s success and sustainability. In this post, I’ll share some of our current thinking about the various systems integration models and our considerations in choosing an approach that will enable integrations between archival applications.
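
As a loose illustration only, here is a minimal Python sketch of the kind of point-to-point exchange a systems integration has to support: one application pushing transfer metadata to another over HTTP. The endpoint, payload fields, and function are invented for the example and do not describe Project Electron’s actual architecture.

```python
# Hypothetical illustration of "data in motion" between two systems: one
# application pushes minimal transfer metadata to another over HTTP.
# The endpoint and payload fields are invented, not Project Electron's API.
import requests


def notify_downstream_system(transfer, endpoint_url):
    """POST minimal metadata about a transfer to a downstream system's intake endpoint."""
    payload = {
        "identifier": transfer["identifier"],
        "organization": transfer["organization"],
        "status": "validated",
    }
    response = requests.post(endpoint_url, json=payload, timeout=10)
    return response.ok  # True for any 2xx response


# Usage (against an invented endpoint):
# notify_downstream_system(
#     {"identifier": "transfer-0001", "organization": "example-org"},
#     "https://integration.example.org/api/transfers/",
# )
```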