As part of our work in the History of Texts and Technology course, students were tasked with learning about version control and how it is used in collaborative projects. Our assignment was to work together to create a sample Introduction to Texts and Technology course syllabus by sending edits back and forth on GitHub so that our selected leader could compile the additions and changes into one final syllabus. This assignment was important for me professionally because I plan to enter industry after completing my PhD, and the ability to use GitHub is a skill most employers in the field expect applicants to have.
In my future academic projects, there will most likely be a collaborative element in everything I am involved in. My research interests include augmented reality and digital media, and my work in both fields so far has consisted of collaborating on larger projects where I own a small piece. As I join future projects, especially ones involving computer programming or some element of coding, those projects will most likely be coordinated through GitHub. Many programming projects are also carried out remotely, so fluency with GitHub is essential if I want to be a competitive applicant.
While I am sure there are other collaborative platforms that are popular, especially within academia (such as using Google Docs for writing projects), I think GitHub can be used to accomplish a variety of projects. I plan to use the GitHub platform as both a means of contributing to projects and as a portfolio that I can refer potential employers to. The collaborative and individual projects I list on my GitHub account will show employers the level of my abilities better than my resume can.
Through the modules on version control (and specifically on using GitHub), the main things I learned were why version control matters in a collaborative project and how to use GitHub in practice. The modules explained clearly that version control helps track the changes each person makes on a project; this is most helpful when a project has been divided into pieces and the person in charge has to know who has changed each part and which pieces are still unfinished. Getting into the nuts and bolts, I learned how to actually use GitHub: creating an account, installing and configuring Git on my laptop, editing a file, pushing my changes, and creating a pull request so my changes could be merged into the main file for others to see. Overall, this was a very useful assignment that built my skills and confidence, so I am better equipped to work on Digital Humanities projects in the future.
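The edit, commit, and merge cycle I just described can be sketched in a few Git commands. This is a minimal local demonstration (the repository and file names here are invented for the demo); in the actual assignment the repository lived on GitHub and the final merge happened through a pull request rather than a local `git merge`:

```shell
# A minimal local sketch of the edit -> commit -> merge cycle.
mkdir syllabus-demo && cd syllabus-demo
git init -q -b main                      # -b requires Git 2.28 or newer
git config user.email "student@example.com"
git config user.name "Student"

echo "# Introduction to Texts and Technology" > syllabus.md
git add syllabus.md
git commit -q -m "Start the syllabus"    # the leader's starting point

git checkout -q -b my-edits              # each contributor works on a branch
echo "Week 1: Version control with Git and GitHub" >> syllabus.md
git commit -q -am "Add week 1"           # record my changes on my branch

git checkout -q main                     # the leader compiles the changes:
git merge -q my-edits                    # on GitHub, this is the pull request
git log --oneline                        # both commits are now in main
```

Opening the pull request itself happens in the GitHub web interface, where the leader can review the diff line by line before merging.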
Chapter 3: The Social Life of the Digital Humanities (DH)
- Focuses on social aspects & impacts of DH
- Analyzing the economies the humanities were created in & how they have changed (with open-source models, IT, and social media)
- These models have reshaped contemporary practices and promote social transformations that have an effect on the relevance of humanities work
- 2 economies this chapter contrasts: the 1st Industrial Revolution and the globalized economy of the networked information age
- Viewing universities as an ‘ivory tower’ doesn’t work anymore – it’s becoming more of a “nodal point” amidst the constant change
Chapter 4: Provocations
- Humanists are working across the digital/analog environments (it’s slowly becoming a more blurred line)
- Complex adaptive systems theory (p.106)
- Much of DH work is team-based (like in STEM fields) and involves bringing different experts together to work on projects
- Large amounts of historical materials are being collected/documented now, resulting in a backlog of work
- Solution: create new models for preserving and processing this work, and make those models user-centered rather than object-centered
- Creation of digital archives – they can become dated, each piece of information is affected by each interaction involving it, and they help tackle cultural memory
- Comprehensive models for graduate training in the humanities have gained more traction with the increase in the number of doctoral students
- Doctoral students now become experts in a specific area and determine how they can tie that specialization into other areas (theory is often used as a bridge between areas)
- Outside skills are becoming more incorporated into careers today (computer programming, library science, digital media/design skills), making the job market more diverse and making it even more complicated to find a position
- Question: How do fellow grad students think the job market is going to change as we continue in our degrees? What skills will be more highly valued?
- Tech skills and research questions aren’t synonymous, but with the growth of DH it’s more & more important to understand how having the tech skills can assist in research
In this blog post, I am focusing on the code, software, and platform studies aspect of Digital Humanities (DH) research. In terms of the Veterans Legacy Project (VLP), this “emerging area” carries a lot of potential impact. VLP is a growing project in its scope and in the projects that fall under its umbrella. Its projects focus on sharing information with cemetery visitors and veterans, and some are designed specifically for teachers to use with their students. As a PhD student, I have been continually seeking a better understanding of coding and programming so that I can use it in my dissertation research and be more employable after my degree is complete.
Burdick also discusses the scholar’s point of view regarding the use of encoding in technology and other fields: “Scholars fascinated by the encoded protocols and instructions that constitute the language of software also look at the cultural contexts in which business, defense, or communications industries fueled the development of increasingly sophisticated approaches to encoding” (p. 53). This fascination could offer further support for recruiting technical experts to DH projects, so that a professional is on hand who can keep the potential uses of encoding and coding in perspective with respect to both long-term and short-term goals. As a budding coder and web designer, I have heard multiple stories from associates about having to manage a client’s expectations because the client wanted a final website produced in a fraction of the time it would actually take.
Going forward in consulting with VLP, it has been important to understand that any work created during this iteration of the course would be a draft or beta test to be fleshed out and built upon over the summer. As stated in the fifth case study of the Burdick piece, “The prototype will be employed to beta test a new way of accessing information, interacting with knowledge, and experiences data research in physical and virtual space” – any product created for VLP would likewise have to go through beta testing to ensure it functions correctly without any nasty surprises. Beyond consulting for VLP, any coding and programming I do for my own research projects would follow a similar process of creation, beta testing, debugging, and finally release/publishing. The Digital Humanities has enormous potential to incorporate technical experts into its projects, which would likely improve the field overall. If a project team included a humanities expert working with a software developer, and the two could understand each other’s perspective, the potential products would be nearly unlimited.
Electing the House of Representatives 1840-2016, a project by the University of Richmond’s Digital Scholarship Lab, uses data visualization tools to show the outcomes of elections over 176 years. While I cannot speak for everyone, I think there are large numbers of people who struggle to understand the political layout of the United States and would like to better comprehend the political inclinations of each state. This project provides a highly developed interactive graphic that users can manipulate to learn about the outcomes of House elections from 1840 to 2016. Among the information a user can glean from exploring the image and adjusting the settings are the winner in each district, the strength of that victory, and whether the district was flipped; the results can be displayed as either a cartogram or a conventional map. Beneath the map is a timeline showing the number of seats Democrats and Republicans won in each election, accompanied by a short narrative statement.
Why is a graphic such as this so valuable when election results can be found on various news outlets? Politics is an important topic to be aware of, yet it is also charged with anxiety and stress, which can make it harder for some individuals to keep up. When a layperson tries to learn the history of his or her district in order to gauge the likely outcome of upcoming elections, an interactive graphic such as this one lets the user connect directly with the data and builds a better understanding of the election results. The effective design of this project ensures that the controls are easy to use, the colors and images are communicative (and accessible, for instance to color-blind users), and each element of the graphic is clearly labeled.
Looking at this project’s argument, it exhibits a clear aim of understanding the history of elections by placing more context around them. The project’s introduction states, “This project aims to recapture the role of Congress as an equal branch in governing, worthy of studying side-by-side with the Presidency, by offering comprehensive and fine-grained data on the history of Congressional elections. To understand the most momentous periods of reform in American political history, we must give attention to all branches of government.” By illustrating the data, the University of Richmond’s Digital Scholarship Lab has brought these past election results to life so that they can almost speak for themselves. With the data clearly displayed, there is no denying when a state was flipped, when Republicans controlled more districts, or when Democrats were in the lead; this information cannot be hidden in reports or scoffed at as dated when it is presented in a graphic that could appear on any cutting-edge website. Perception is an incredibly important element in spreading, and gaining acceptance for, information, and reinterpreting this election data as an interactive graphic makes data from 1840 feel current.
In conclusion, Electing the House of Representatives 1840-2016 is an excellent example of a Digital Humanities (DH) project that should serve as a model for future projects. It illustrates the potential of technology (whether through web design, data visualization, or interactive graphics) to bring historical data from centuries past back to life. Before DH projects became common, it was far more difficult to revitalize historical information; now there are growing numbers of DH projects focused on different facets of history.
Link to the Electing the House of Representatives 1840-2016 project:
The Veterans Legacy Program is a Digital Humanities (DH) project that memorializes and honors veterans and brings history to life for those seeking to learn more about the Seminole Wars and World War I. UCF’s History department and Center for Humanities and Digital Research (CHDR) have been working collaboratively to create a variety of digital tools and teaching materials for educators in K-12 education. A project of this scope and longevity contains multiple moving parts, so I want to focus on the types of players involved in this DH project, how it incorporates skillsets outside of academic research/training, and what it stands to contribute to society.
Digital Humanities projects generally involve a large amount of collaboration, typically among professors, students, professionals within the targeted field, educators, and volunteers. The VLP project involves most of these collaborators, though most of the work is conducted by UCF professors, employees, and students. Collaboration within this project largely depends on which prong of the project is involved: the main professors in charge oversee every element of collaboration while other collaborators focus on their portion of the project. Students within the Texts & Technology department will work closely with the participating educators to develop exciting and appropriate classroom materials in the hopes of capturing students’ interest. Other students working on digital tools and representations (such as Tableau) will work more closely with Dr. French and Dr. Giroux to ensure their representations of the data convey the information effectively.
One of the lovely elements of any collaborative project is its incorporation of a variety of skillsets and backgrounds, and the VLP Consulting Project is no exception. The overall aim of the project is the preservation and spread of information to everyone, child and adult. Having CHDR fill a role in VLP development ensures that the historical information is presented in new ways outside of books and journal articles (as might be expected from traditional history-based projects). Students involved in creating the AR applications and the digital representations of data bring a wide pool of knowledge as well, because they see the project from a different perspective than the professionals running it: they are a different age demographic, have different views on what makes new information stick, and bring different academic backgrounds to this interdisciplinary project.
VLP stands to make a large contribution to the fields of History and Digital Humanities, and to museums and curation. This project doesn’t focus on some faraway place or time with no connection to society today; this is part of our history. The methods used in this project can serve as a model for updating history education in the digital age so that future students don’t write it off as ‘boring’ or ‘not important’. The augmented reality app being developed offers an innovative way for cemetery visitors to engage with history, as they can now access information about the individuals buried at each cemetery directly from their smartphones.
My background is filled with varied experiences in Anthropology and Linguistics, and a majority of my time is now spent bringing my liberal arts and social skills into technology. I think I could best help this project through public outreach, either via social media or by working on interactive data visualizations. I have experience conducting research in different fields and in different countries, and this has given me a strong understanding of how to manage projects involving multiple people: the organization and communication required. So even if I did not work on a public-facing part of the project, I believe I would still be a strong asset in helping manage the backend side of VLP.
This week we are exploring big data through Graham et al.’s Exploring Big Historical Data: The Historian’s Macroscope. The main theme throughout this text is the range of digital tools today’s historians can use (or already are using). Among these tools is Zotero, which enables users to save and export citations and can be a lifesaver for researchers. On page 6 of Big Historical Data, Zotero is cited as a tool for finding commonalities: “Using a plugin, a little program or component that adds something to a software program, for the open source reference and research management software Zotero, Fred Gibbs at George Mason University developed a means to look at specific cases (e.g. those pertaining to “poison”) and look for commonalities…Through comparing differences in documents (using Normalized Compression Distance, or the standard tools that compress files on your computer) one can get the database to suggest trials that are structurally similar to the one a user is currently viewing.” This is one example of the tools historians are using to conduct big data research in order to get a better view of ‘the big picture’ in historical occurrences. Tools such as Zotero (which I picked because I also use it in my work) have made it possible to conduct big data research without the headache-inducing amount of resources it would have required before open source tools became available.
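The “Normalized Compression Distance” mentioned in that quote is simple enough to sketch with everyday command-line tools. This is a rough illustration only: the two sample “trial” texts are invented stand-ins for real trial records, and gzip stands in for whatever compressor a real system would use; it is not the actual Zotero plugin.

```shell
# Normalized Compression Distance (NCD) sketch using gzip as the compressor.
printf 'The prisoner was indicted for administering poison to the victim.\n' > trial_a.txt
printf 'The prisoner was indicted for administering arsenic to her husband.\n' > trial_b.txt

csize() { gzip -c -9 "$1" | wc -c; }     # compressed size in bytes

ncd() {
  cx=$(csize "$1"); cy=$(csize "$2")
  cat "$1" "$2" > both.tmp               # compress the concatenation too
  cxy=$(csize both.tmp); rm -f both.tmp
  # NCD = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
  awk -v cx="$cx" -v cy="$cy" -v cxy="$cxy" \
    'BEGIN { min = cx<cy?cx:cy; max = cx<cy?cy:cx; printf "%.3f\n", (cxy-min)/max }'
}

ncd trial_a.txt trial_b.txt              # values near 0 mean more similar texts
```

The intuition, as Graham et al. suggest with “the standard tools that compress files on your computer,” is that two documents which compress well together share structure, so a low score flags structurally similar trials.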
As Graham et al. state, “There are three issues of critical importance to understanding big data as a historian: the open access and open source movements, copyright, and what we mean by textual analysis” (p. 38). While this quote outlines the topics a historian needs to understand when pursuing big data, it also shows the potential these tools hold for a historian capable of reimagining data to catch a person’s attention. Until recently, information was reported in fairly uniform ways: itemized tables and lists with accompanying reports arranged by topic (generally chronologically). Now it has become acceptable to display data in new ways that can spark understanding in a variety of observers. Data is being analyzed and displayed as word clouds, line graphs, and scatterplots (at times using color to contrast different topics and their frequency); these new forms of data visualization allow historians to reach many more people and increase comprehension for users.
An excellent example of applying big data research and data visualization to historical research is the Viral Texts project. This project has several components, including the Love Letter Exhibit, the Fugitive Verses edition, and a visualization of the network of “viral text” sharing from 1836-1899. Among these, I want to focus on the network visualization: this interactive graphic allows users to zoom in and out of the hundreds of nodes and select one from within the mass of connections, isolating it to see its information and the nodes to which it connects. I think interactive visualizations like this communicate more information than a written report ever could; the interactive component is far better at holding a user’s attention (especially if the user is not a history major and stumbled across the website by coincidence).
The Digital Humanities is a rapidly expanding field, though I don’t think everyone knows what to do with it yet. Projects such as the Viral Texts project give insight into what historians and others within the humanities can do to integrate their work into the ever-growing tech world. By incorporating tools such as Zotero, Tableau, AntConc, and Voyant Tools when publishing research, historians can better claim a platform in this digital age.