Course Schedule
Date | Event |
---|---|
Jan. 20 | MLK Day (no class!) |
Jan. 22 | Welcome to Data Science Discovery: The next BIG thing at Illinois is Data Science and it starts with Discovery! |
Jan. 24 | Data Science Tools: Data, Science, and Tools each have a meaning of their own; explore how one relates to another and how they all relate to Data Science DISCOVERY! |
Jan. 27 | Experimental Design and Row Selection (pandas): Does the death penalty have a deterrent effect? Is chocolate good for you? What causes breast cancer? All of these questions attempt to assign a cause to an effect. A careful examination of data can help shed light on questions like these. |
Jan. 29 | Blocking and Conditionals: Random assignment to treatment and control works best to make the groups as alike as possible. With enough subjects, random differences average out. But what do you do if you have a small sample? Blocking first, then randomizing, ensures that the treatment and control groups are balanced with regard to the variables blocked on. We can use conditionals in pandas to help us do this (see the pandas sketch after this table)! |
Jan. 31 | Confounders and Observational Studies: For years, observational studies have shown that people who carry lighters are more likely to get lung cancer. However, this does not mean that carrying lighters causes you to get cancer. Smoking is an obvious confounder! If we weren’t sure about this, how could we determine whether it’s the lighters, the confounders, or some combination of both that is causing the lung cancer? |
Feb. 3 | Simpson's Paradox and Stratification: Stratification is often called the "blocking of observational studies" and allows us to further explore observational studies. |
Feb. 5 | Measures of Center and Spread: Parameters are numerical facts about the population. In this lecture, we will look at parameters such as the average (µ) and standard deviation (σ) of a list of numbers (see the formulas after this table). Later, we will start talking about statistics. Statistics are estimates of parameters computed from a sample. |
Feb. 7 | Boolean Logic and Conditionals |
Feb. 10 | Grouping Data (pandas): A groupby operation involves some combination of splitting the object, applying a function, and combining the results. This can be used to group large amounts of data and compute operations on these groups (see the groupby sketch after this table). |
Feb. 12 | Data Visualization |
Feb. 14 | Bar Graphs and Histograms: Large tables of numbers can be difficult to interpret, no matter how organized they are. Sometimes it is much easier to interpret graphs than numbers. |
Feb. 17 | Quartiles and Box Plots: Just like histograms, box plots are used as a way to visually represent numerical data. They do this through selected percentiles, which are given special names. |
Feb. 19 | Algorithms to Solve Complex Problems: An algorithm is a step-by-step, detailed set of instructions to solve a problem. An algorithm can be expressed as English sentences (usually as a numbered list) and is a great way to begin solving complex problems. |
Feb. 21 | Introduction to Probability: Probability is the likelihood or chance of an event occurring. This begins a multi-week journey discovering probability and how to simulate probabilistic events. |
Feb. 24 | Introduction to Probability II: Probability is the likelihood or chance of an event occurring. This continues a multi-week journey discovering probability and how to simulate probabilistic events. |
Feb. 26 | Simulation |
Feb. 28 | Midterm 1 (CBTF) happens this week; no class on Friday! |
Mar. 2 | Addition Rule + Conditional Probability: The conditional probability of an event B is the probability that the event will occur given that an event A has already occurred (see the formulas after this table). |
Mar. 4 | Functions in Python and Conditional Probability |
Mar. 6 | Bayes Rule: Bayes Rule allows us to express a conditional probability in terms of its inverse, often making the problem easier to solve (see the formula after this table). |
Mar. 9 | Simulation Analysis + Images: Simulation allows us to understand the outcomes of uncertain events. We will begin with basic simulations and build up to more complex simulations throughout this semester (see the simulation sketch after this table). |
Mar. 11 | Images + Random Variables |
Mar. 13 | Discrete Random Variables, Bernoulli, and Binomial: Any experiment that has exactly two outcomes, each with a fixed probability, follows a Bernoulli distribution. The Binomial Distribution with parameters n and p is the discrete probability distribution of the number of successes in a sequence of n independent experiments. For a single trial (n = 1), the binomial distribution is a Bernoulli distribution (see the formula after this table). |
Mar. 16 | Spring Break |
Mar. 18 | Spring Break |
Mar. 20 | Spring Break |
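
The sketches and formulas below expand on a few of the lecture descriptions above; they are illustrative only. For the Jan. 27 and Jan. 29 lectures (row selection and conditionals in pandas), here is a minimal sketch; the DataFrame and its column names are hypothetical and not taken from the course materials.

```python
# Minimal sketch: conditional row selection in pandas.
# The data and the column names ("treatment", "age") are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "subject": ["A", "B", "C", "D"],
    "treatment": ["control", "drug", "drug", "control"],
    "age": [34, 29, 41, 38],
})

# A boolean conditional keeps only the rows where the condition is True.
drug_group = df[df["treatment"] == "drug"]

# Conditions combine with & (and) and | (or); each must be parenthesized.
older_controls = df[(df["treatment"] == "control") & (df["age"] > 35)]

print(drug_group)
print(older_controls)
```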
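
For the Feb. 5 lecture, the population parameters mentioned there are conventionally defined as follows (standard textbook definitions, not reproduced from the course notes):

```latex
\mu = \frac{1}{N}\sum_{i=1}^{N} x_i,
\qquad
\sigma = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(x_i - \mu\right)^2}
```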
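
For the Feb. 10 lecture, here is a minimal sketch of the split-apply-combine pattern with pandas `groupby`; the example data and column names are made up for illustration.

```python
# Minimal sketch of split-apply-combine with pandas groupby.
# The data and column names ("college", "gpa") are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "college": ["Engineering", "LAS", "Engineering", "LAS", "Business"],
    "gpa": [3.4, 3.7, 3.1, 3.9, 3.5],
})

# Split the rows by college, apply the mean to each group, combine the results.
mean_gpa = df.groupby("college")["gpa"].mean()
print(mean_gpa)

# .agg applies several functions to each group at once.
summary = df.groupby("college")["gpa"].agg(["count", "mean", "max"])
print(summary)
```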
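
For the Mar. 2 lecture, the addition rule and the definition of conditional probability can be written in their standard forms:

```latex
P(A \cup B) = P(A) + P(B) - P(A \cap B),
\qquad
P(B \mid A) = \frac{P(A \cap B)}{P(A)} \quad \text{for } P(A) > 0
```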
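
For the Mar. 6 lecture, Bayes Rule expresses one conditional probability in terms of its inverse:

```latex
P(A \mid B) = \frac{P(B \mid A)\,P(A)}{P(B)}
```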
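
For the Mar. 9 lecture, here is a minimal simulation sketch; the experiment (counting heads in 100 fair coin flips) and the numbers are illustrative, not from the course.

```python
# Minimal simulation sketch: estimate the chance of at least 60 heads
# in 100 fair coin flips by repeating the experiment many times.
import random

def one_trial(flips=100):
    """Flip a fair coin `flips` times and return the number of heads."""
    return sum(random.random() < 0.5 for _ in range(flips))

trials = 10_000
hits = sum(one_trial() >= 60 for _ in range(trials))
print(f"Estimated P(at least 60 heads in 100 flips): {hits / trials:.4f}")
```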
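
For the Mar. 13 lecture, the Binomial(n, p) probability of exactly k successes is given below in its standard form; setting n = 1 recovers the Bernoulli distribution.

```latex
P(X = k) = \binom{n}{k}\, p^{k} (1 - p)^{n - k},
\qquad k = 0, 1, \ldots, n
```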