
When & Where
Date: 
Wed, October 31, 2018 - 12:00 PM to 2:00 PM
Location: 
Barrows 356: D-Lab Convening Room
Description

This hands-on workshop walks through the common "preprocessing recipe" that serves as the foundation for a variety of text-analysis applications, along with some basic natural language processing techniques. These include: a) digitization (UTF-8 encoding), b) removal of stopwords, numbers, and punctuation, c) tokenization, d) calculation of word frequencies and proportions, e) part-of-speech tagging, and f) concordances.
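As a rough sketch of steps b) through d) of the recipe above, here is a minimal, standard-library-only version. The workshop itself uses NLTK, whose tokenizers, stopword lists, and taggers replace the toy regular expression and the illustrative stopword fragment used here.

```python
import re
from collections import Counter

# Illustrative stopword fragment -- the workshop uses NLTK's full English list.
STOPWORDS = {"the", "a", "an", "and", "of", "to", "in", "is"}

def preprocess(text):
    """Lowercase, strip numbers/punctuation, tokenize, and drop stopwords."""
    tokens = re.findall(r"[a-z]+", text.lower())      # crude tokenization (step c)
    return [t for t in tokens if t not in STOPWORDS]  # stopword removal (step b)

def word_proportions(tokens):
    """Word frequencies and proportions (step d)."""
    counts = Counter(tokens)
    total = sum(counts.values())
    return {w: (n, n / total) for w, n in counts.items()}

tokens = preprocess("The cat sat on the mat, and the cat slept.")
print(tokens)                           # ['cat', 'sat', 'on', 'mat', 'cat', 'slept']
print(word_proportions(tokens)["cat"])  # (2, 0.3333333333333333)
```

In NLTK, the same steps become calls to `nltk.word_tokenize`, the `stopwords` corpus, and `nltk.FreqDist`, with `nltk.pos_tag` and `nltk.Text.concordance` covering steps e) and f).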

Prior knowledge: We will be using the NLTK Python package, so basic familiarity with Python is required if you wish to follow along with the tutorial. Completion of D-Lab's Python FUN!damentals workshop series will be sufficient.

This workshop is part of a three-part series that will prepare participants to move forward with text analysis research, with a special focus on humanities and social science applications. Please register for each workshop separately.

Getting started & software prerequisites:

We will learn how to implement text analysis methods with Jupyter Notebooks.

To run the code on your own computer, you will need Python 3 installed along with some additional libraries. Anaconda is a free distribution that makes installation easy: it bundles the Python language together with many of the packages we rely on in our workshops, so you only have to download and install one thing. To use this method, visit the Anaconda download page and follow the instructions for your operating system to download the Python 3.x version (it might be 3.6, 3.7, or higher). Please be sure to download the 3.x version, not the Python 2.x version. You may have a choice between a graphical installer and a command-line installer; use whichever you're comfortable with, though the graphical one is easier.
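Once Anaconda is installed, a quick sanity check (a hypothetical snippet, not part of the workshop materials) confirms that the interpreter you are running is Python 3.x rather than 2.x:

```python
import sys

# Anaconda's Python 3.x build should report a major version of 3.
assert sys.version_info.major == 3, "Please install the Python 3.x version!"
print(sys.version.split()[0])  # e.g. "3.7.4" -- the exact number varies by install
```

If the assertion fails, you are running a Python 2 interpreter and should re-install using the 3.x Anaconda download.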


Details
Format Detail: 
hands-on, interactive
Participant Technology Requirement: 
Laptop
Log in to register for this training.