Upcoming Deadlines
Final grades released:
- You can view your final course grade in Compass 2g!
- Have an amazing break!!
Lecture: Storytelling and Data Visualization
Lecture: t-tests
Lecture: Hypothesis Testing
Hypothesis Tests are statistical tests to see if a difference we observe is due to chance. Many times, we have competing hypotheses about the value of a population parameter. It’s impossible or impractical to examine the whole population to find out which hypothesis is true, so we take a random sample and see which hypothesis is better supported by our sample data.
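Here is a minimal sketch of a simulation-based hypothesis test in Python (the coin-flip scenario and all numbers are made up for illustration, and it assumes numpy is available):

```python
# Hypothetical question: a coin landed heads 60 times in 100 flips -- is it fair?
import numpy as np

observed_heads = 60     # hypothetical sample result
n_flips = 100

# Simulate the test statistic under the null hypothesis (the coin is fair).
np.random.seed(42)
simulated_heads = np.random.binomial(n=n_flips, p=0.5, size=10_000)

# p-value: how often chance alone produces a result at least as extreme as ours.
p_value = np.mean(simulated_heads >= observed_heads)
print(p_value)          # a small p-value suggests the difference is not just chance
```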
Lecture: Midterm 2 (CBTF) - No Class :)
Lecture: Residuals + RMSE
Lecture: Residuals, RMSE, Regression in Python
Lecture: Scatterplots, Correlation, Simple Regression
Lecture: Sampling
We take a sample to find out about a larger population. We usually don’t have the resources to gather information on everyone in the whole population so instead, we select a small sample and use it to make inferences about the larger population.
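A small sketch of this idea in Python (the "population" is synthetic data invented for illustration):

```python
# Estimate a population average from a random sample.
import numpy as np

np.random.seed(0)
population = np.random.normal(loc=170, scale=10, size=100_000)   # pretend "heights"

sample = np.random.choice(population, size=200, replace=False)   # simple random sample

print(population.mean())   # the parameter we usually cannot compute in practice
print(sample.mean())       # the statistic we use to estimate it
```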
Lecture: Central Limit Theorem
The normal approximation for random variables amounts to taking advantage of the Central Limit Theorem. We replace the true probability histogram for the sum, average, or percentage of draws by the normal curve before computing areas.
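An illustrative sketch of the Central Limit Theorem in action (the population here is synthetic and deliberately skewed; numpy and matplotlib are assumed):

```python
# Averages of many draws from a non-normal population still follow a roughly normal histogram.
import numpy as np
import matplotlib.pyplot as plt

np.random.seed(1)
population = np.random.exponential(scale=2.0, size=100_000)   # skewed population

sample_means = [np.random.choice(population, size=50).mean() for _ in range(5_000)]

plt.hist(sample_means, bins=40, density=True)
plt.title("Averages of 50 draws: approximately normal")
plt.show()
```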
Lecture: Normal Approximation
The normal curve is a bell-shaped "ideal" histogram that many real histograms closely resemble. For those histograms, you can use the normal curve to estimate percentages for the data.
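A quick sketch of using the normal curve to estimate a percentage (the mean and SD are hypothetical; scipy is assumed):

```python
# Estimate the percentage of values within one standard deviation of the mean.
from scipy import stats

mean, sd = 100, 15                     # hypothetical data: mean 100, SD 15
lower, upper = mean - sd, mean + sd

area = stats.norm.cdf(upper, loc=mean, scale=sd) - stats.norm.cdf(lower, loc=mean, scale=sd)
print(round(area * 100, 1))            # about 68.3% of the data
```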
Lecture: Discrete Random Variables, Bernoulli, and Binomial
Any trial that has exactly two possible outcomes, each occurring with a fixed probability, follows a Bernoulli distribution. The Binomial Distribution with parameters n and p is the discrete probability distribution of the number of successes in a sequence of n independent trials, each with success probability p. For a single trial (n=1), the binomial distribution is a Bernoulli distribution.
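A short sketch of the Bernoulli/Binomial relationship (the parameters are chosen only for illustration; scipy is assumed):

```python
from scipy import stats

p = 0.3                                  # probability of success on one trial
bernoulli = stats.bernoulli(p)           # a single trial
binomial = stats.binom(n=10, p=p)        # number of successes in 10 independent trials

print(bernoulli.pmf(1))                  # P(success) = 0.3
print(stats.binom(n=1, p=p).pmf(1))      # binomial with n=1 matches the Bernoulli
print(binomial.pmf(3))                   # P(exactly 3 successes in 10 trials)
```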
Lecture: Addition Rule + Conditional Probability
The conditional probability of an event B is the probability that the event will occur given that an event A has already occurred.
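A tiny worked example with two dice (the events are made up for illustration):

```python
# P(total is 8 | first die shows a 5), computed by counting equally likely outcomes.
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))      # all 36 equally likely rolls

A = [o for o in outcomes if o[0] == 5]               # event A: first die is 5
A_and_B = [o for o in A if sum(o) == 8]              # ...and event B: the total is 8

print(len(A_and_B) / len(A))    # P(B | A) = P(A and B) / P(A) = 1/6
```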
Lecture: Midterm 1 (CBTF) - No Class :)
Lecture: Probability, Birthday Problem, and Control Flow
Lecture: Introduction to Probability + Monty Hall
Probability is the likelihood or chance of an event occurring. This begins a multi-week journey discovering probability and how to simulate probabilistic events.
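A sketch of what simulating a probabilistic event might look like, using the Monty Hall game (door labels and the number of repetitions are arbitrary choices):

```python
# Estimate the chance of winning the Monty Hall game if you always switch doors.
import random

def play_switch():
    doors = [0, 1, 2]
    prize = random.choice(doors)
    pick = random.choice(doors)
    # Host opens a door that is neither the pick nor the prize.
    opened = random.choice([d for d in doors if d != pick and d != prize])
    # Switch to the remaining closed door.
    new_pick = [d for d in doors if d != pick and d != opened][0]
    return new_pick == prize

wins = sum(play_switch() for _ in range(10_000))
print(wins / 10_000)    # close to 2/3: switching wins about twice as often as staying
```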
Lecture: Algorithms to Solve Complex Problems
An algorithm is a step-by-step, detailed set of instructions to solve a problem. Writing an algorithm out as English sentences (usually as a numbered list) is a great way to begin solving a complex problem.
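A toy example of turning an English algorithm into code (the task, finding the largest number in a list, is chosen only for illustration):

```python
# Algorithm, in English:
#   1. Start with the first number as the "largest so far".
#   2. Look at each remaining number; if it is bigger, make it the new largest so far.
#   3. After checking every number, report the largest so far.
def largest(numbers):
    largest_so_far = numbers[0]
    for value in numbers[1:]:
        if value > largest_so_far:
            largest_so_far = value
    return largest_so_far

print(largest([3, 41, 7, 12]))   # 41
```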
Lecture: Quartiles and Box Plots
Just like histograms, box plots are used to visually represent numerical data. They do this by displaying a few selected percentiles, which are given special names.
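A short sketch of the quartiles and a box plot (the data is synthetic, generated only for illustration; numpy and matplotlib are assumed):

```python
import numpy as np
import matplotlib.pyplot as plt

np.random.seed(2)
data = np.random.normal(loc=50, scale=10, size=500)

q1, median, q3 = np.percentile(data, [25, 50, 75])   # the specially named percentiles
print(q1, median, q3)

plt.boxplot(data)     # the box spans Q1 to Q3; the line inside is the median
plt.show()
```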
Lecture: Bar Graphs and Histograms
Large tables of numbers can be difficult to interpret, no matter how organized they are. Sometimes it is much easier to interpret graphs than numbers.
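A minimal sketch of the idea (the "exam scores" are made-up numbers; matplotlib is assumed): the same values that would be hard to read as a long table are easy to read as a histogram.

```python
import numpy as np
import matplotlib.pyplot as plt

np.random.seed(3)
exam_scores = np.random.normal(loc=75, scale=12, size=300)   # hypothetical scores

plt.hist(exam_scores, bins=20)
plt.xlabel("Score")
plt.ylabel("Count")
plt.show()
```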
- Perception of Probability Words Survey (+1 EC)
- CBTF Exam Registration
- Lecture Handout
- Extra Credit Notebook (+1)
Lecture: Grouping Data (pandas)
A groupby operation involves some combination of splitting the object, applying a function, and combining the results. This can be used to group large amounts of data and compute operations on these groups.
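A small split-apply-combine sketch (the DataFrame and its column names are hypothetical):

```python
import pandas as pd

df = pd.DataFrame({
    "college": ["LAS", "LAS", "Engineering", "Engineering", "Business"],
    "gpa":     [3.4, 3.8, 3.6, 3.2, 3.9],
})

# Split the rows by college, apply the mean to each group, combine into one result.
print(df.groupby("college")["gpa"].mean())
```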
Lecture: Boolean Logic and Conditionals
Lecture: Measures of Center and Spread
Parameters are numerical facts about the population. In this lecture, we will look at parameters such as the average (µ) and standard deviation (σ) of a list of numbers. Later, we will start talking about statistics. Statistics are estimates of parameters computed from a sample.
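A quick sketch of computing the average and standard deviation of a small, made-up list (numpy assumed):

```python
import numpy as np

data = [2, 4, 4, 4, 5, 5, 7, 9]

print(np.mean(data))   # average (µ) = 5.0
print(np.std(data))    # standard deviation (σ) = 2.0
```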
Lecture: Simpson's Paradox and Stratification
Stratification is often called the "blocking of observational studies": by splitting the data into strata, we can further explore an observational study group by group.
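An illustrative sketch with made-up counts showing why we stratify: overall, treatment B looks better, but within each stratum treatment A is better (Simpson's Paradox). Pandas is assumed.

```python
import pandas as pd

df = pd.DataFrame({
    "treatment": ["A", "A", "B", "B"],
    "stratum":   ["easy", "hard", "easy", "hard"],
    "successes": [19, 60, 90, 10],
    "patients":  [20, 100, 100, 20],
})

overall = df.groupby("treatment")[["successes", "patients"]].sum()
print(overall["successes"] / overall["patients"])        # B wins overall

by_stratum = df.set_index(["stratum", "treatment"])
print(by_stratum["successes"] / by_stratum["patients"])  # A wins in every stratum
```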
Lecture: Confounders and Observational Studies
For years, observational studies have shown that people who carry lighters are more likely to get lung cancer. However, this does not mean that carrying a lighter causes you to get cancer. Smoking is an obvious confounder! If we weren’t sure about this, how could we determine whether it’s the lighters, the confounder, or maybe some combination of both that is causing the lung cancer?
Lecture: Blocking and Conditionals
Random assignment to treatment and control works best to make the groups as alike as possible. With enough subjects, random differences average out. But what do you do if you have a small sample? Blocking first, then randomizing, ensures that the treatment and control groups are balanced with regard to the variables blocked on. We can use conditionals in pandas to help us do this!
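A minimal sketch of using conditionals in pandas to check that blocked groups are balanced (the column names and values are hypothetical):

```python
import pandas as pd

df = pd.DataFrame({
    "group": ["treatment", "control", "treatment", "control"],
    "age":   [21, 22, 35, 34],
    "block": ["young", "young", "old", "old"],
})

# Boolean conditions select rows; here we compare average age across the two groups.
print(df[df["group"] == "treatment"]["age"].mean())
print(df[df["group"] == "control"]["age"].mean())
```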
Lecture: Experimental Design and Row Selection (pandas)
Does the death penalty have a deterrent effect? Is chocolate good for you? What causes breast cancer? All of these questions attempt to assign a cause to an effect. A careful examination of data can help shed light on questions like these.
Lecture: Data Science Tools
Data, Science, and Tools all have meaning on their own; explore how each relates to the others and how they all relate to Data Science DISCOVERY!
Lecture: Welcome to Data Science Discovery
The next BIG thing at Illinois is Data Science and it starts with Discovery!
- Lecture Slides
- Lecture Handout
- Join the course Piazza
- Register your iClicker on Compass 2g
- Day 1 Survey (+1 EC)
Welcome to Data Science Discovery!
Our first lecture is Monday, Aug. 26 at 12:00 noon in Lincoln Hall Theater. See you there! :)