Part 4. Digital Tools Explained

5 Essential Categories of Digital Tools

When engaging in Digital Humanities, I have found it useful to have a working knowledge of select tools in order to perform research and present your findings. You can always get specific training if you start work on a specialized project. There are a multitude of projects and even more sophisticated tools and methodologies. These basic tools are meant as an introduction to DH, and more specifically to data analysis and storytelling. This chapter explores five broad categories of tools needed to get started in Digital Humanities.

  1. Spreadsheet tools are applications for organization, analysis, and storage of data in tabular form. Spreadsheets are also ideal for cultivating, saving, and archiving datasets. Datasets are the foundation for producing stories that draw on large quantities of items.
    • Microsoft Excel is an electronic version of the paper accounting worksheet, where users can store, edit, and analyze datasets.
  2. Data visualization tools are computer programs that convert numeric and textual information into tables, figures, charts, and other images.
    • Tableau Public is a free platform for publicly sharing and exploring data visualizations online.
  3. Data scraping tools refer to computer programs and computational methodologies used to extract large bodies of content or datasets from websites or online platforms. Data scraping makes it possible to retrieve sizable quantities of information from one site in order to study it or present it in other contexts.
    • Google Sheets formulas allow you to extract specific information, such as tables embedded in web pages, directly into a Google Sheets spreadsheet.
  4. Data cleaning tools refer to programs that adjust elements of a dataset in order to ensure that items are accurate and adhere to a standard format. Data scraping and other processes that combine varied sources of information often misalign items in a dataset. To create cohesion and accuracy, it is necessary to revise or “clean” the data.
    • OpenRefine is an application for data cleanup and transformation to other formats.
  5. Text mining tools refer to programs that transform elements of a document, such as the words in a novel or across several novels, into structures based on type, quantity, or other arrangements. Text mining is especially useful for people interested in patterns of word and language use in written compositions, speeches, and social media posts, to name a few.
    • Voyant Tools is an open-source, web-based application for identifying and tabulating word patterns based on a single document or collection of texts known as a corpus.
    • Topic Modeling Tool learns the topics in a collection of documents and tags each document with a small number of topics using a Latent Dirichlet Allocation (LDA) algorithm.
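To make the data scraping idea concrete: in Google Sheets, the built-in IMPORTHTML formula can pull a table from a web page into a spreadsheet. The same basic mechanics can be sketched in a few lines of standard-library Python. This is only an illustration, not a full scraping workflow; the HTML snippet and the book titles in it are invented stand-ins for a live web page.

```python
from html.parser import HTMLParser

# Minimal parser that collects the text of every table cell (<td>) it sees.
class TableCellParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.cells.append(data.strip())

# Invented sample page standing in for content fetched from a website.
sample_html = """
<table>
  <tr><td>Their Eyes Were Watching God</td><td>1937</td></tr>
  <tr><td>Invisible Man</td><td>1952</td></tr>
</table>
"""

parser = TableCellParser()
parser.feed(sample_html)
# Pair each title with its year: the table has two cells per row.
rows = list(zip(parser.cells[0::2], parser.cells[1::2]))
print(rows)
```

The scraped rows can then be saved to a spreadsheet as the start of a dataset.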
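Data cleaning can also be sketched briefly. OpenRefine's clustering feature groups variant spellings of the same value using a "fingerprint" key; the sketch below imitates that idea in plain Python. The messy name column is invented for illustration, and the fingerprint here is a simplified version of the method, not OpenRefine's exact implementation.

```python
import re
from collections import defaultdict

def fingerprint(value):
    # Simplified fingerprint key: lowercase, strip punctuation,
    # then sort the remaining words so word order no longer matters.
    tokens = re.sub(r"[^\w\s]", "", value.lower()).split()
    return " ".join(sorted(set(tokens)))

# Invented messy column, the kind of inconsistency scraping often produces.
names = [
    "Hurston, Zora Neale",
    "Zora Neale Hurston",
    "zora neale hurston ",
    "Ellison, Ralph",
    "Ralph Ellison",
]

# Group values that share a fingerprint: each group is one "real" entity.
clusters = defaultdict(list)
for name in names:
    clusters[fingerprint(name)].append(name)

for key, variants in clusters.items():
    print(key, "->", variants)
```

Each cluster can then be collapsed to a single standard spelling, which is exactly the edit OpenRefine offers to apply in bulk.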

In this section, assistant editor John Merritt, Jr., provides key insight into data scraping & cleaning tools, as well as topic modeling.




The Data Notebook by Peace Ossom-Williamson and Kenton Rambsy is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
