Thursday 14 January 2016

Intermediate level Python workshop - part 2

In our planning from the previous post, we've established three major areas of learning that we can pick and choose from. The next step is to establish basic core exercises for each of our three topics, and write down the pre-requisite knowledge for each exercise (NEVER underestimate pre-requisite knowledge!). This will also help us design the revision part at the start of the workshop.

I think coming up with good exercises is the most creative and difficult part of designing a workshop. I'm not looking forward to considering and chucking away hundreds of ideas because they have some essential flaw. We'll see what we can do though.

We'll have to end up discarding some topic areas. In an ideal world I would be prepared to teach all three topic areas, and I could pick and choose and adapt based on the needs of the class. However, the world is not ideal, and I'm crap, so we'll have to discard either image generation or audio generation, or maybe even both. We'll see how the pre-requisites pan out.

First port of call is to plan the sequence of delivery for each topic. The main part of the workshop will be the web scraping and data visualization, which is pretty cool.

Plan:

- Revision of all pre-requisite knowledge using exercises of slowly increasing complexity

- Introduction to web scraping theory. Ideally, a very simplified model that helps to explain all the things that will be going on when they make a request to a website. Maybe just the client-server model is enough? The problem, though, is coming up with an interesting, engaging, and simple way of presenting the concept (a rough request/response sketch follows this list).

- Basic web scraping exercise. This is really hard, since it's super difficult to come up with an exercise that you couldn't do just as easily by opening the webpage in a web browser. I'm thinking that you can use Python's neat string built-ins to do something like a word count (also sketched after the list). I might have to cover the concept of HTML incredibly briefly (something like "extra code to control the appearance of the text").

- More complex web scraping exercise. This would be a more sophisticated program that integrates the web scraping with things like variables to collate the data rather than just print it out ad nauseam. Ideally this would involve loops so that the students get to revise loops again. Maybe scrape data from multiple URLs or something (see the sketch below the list).

- Basic data visualization exercise. Ideally this would also include use of variables (e.g. for summation or average calculation); there's a rough sketch of this after the list as well.

- More sophisticated data visualization exercise. This one would use graphs which are closer to the kinds of graphs in the final project.

- Web scraping revision exercise

- Final project: Combine web scraping and data visualization. The whole workshop ideally will lead up to this point where the students may be able to visualize scraped data by themselves. In the beginner workshops, we generally teach by giving them example programs and getting them to expand and extend on those.
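For the theory intro, a short live demo might help make the client-server model concrete: our program (the client) sends a request, and the server sends back a response. This is only a rough sketch using the requests library, and the URL is just a placeholder.

```python
import requests

# The client (our program) sends a request to a server...
url = "https://example.com"   # placeholder URL for the demo
response = requests.get(url)

# ...and the server sends back a response.
print(response.status_code)                    # e.g. 200 means "OK"
print(response.headers.get("Content-Type"))    # what kind of document came back
print(response.text[:200])                     # first 200 characters of the HTML
```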
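A possible shape for the basic scraping exercise, sticking to string built-ins as mentioned above (the URL and the search word are placeholders):

```python
import requests

# Fetch the page and treat the HTML as one big string.
url = "https://example.com/article"   # placeholder URL
html = requests.get(url).text

# Python's string built-ins are enough for a rough word count.
# (This counts occurrences in the raw HTML, markup and all, which is
# fine for a first exercise.)
word = "python"
count = html.lower().count(word)
print("The word", repr(word), "appears", count, "times on the page.")
```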
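And a possible shape for the more complex version: loop over several pages and collate the counts in a list instead of printing as you go (the URLs are placeholders again):

```python
import requests

# Placeholder URLs - in the real exercise the students would pick their own.
urls = [
    "https://example.com/page1",
    "https://example.com/page2",
    "https://example.com/page3",
]

word = "python"
counts = []                    # collate the results in a variable

for url in urls:               # loop revision
    html = requests.get(url).text
    counts.append(html.lower().count(word))

# Report everything at the end instead of printing ad nauseam.
for url, count in zip(urls, counts):
    print(url, "->", count)
```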
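For the basic visualization exercise, here's a sketch that needs nothing beyond built-ins, just to pin down the shape of the exercise: made-up numbers, a sum, an average, and a crude text bar chart. Real graphs can replace the asterisks once the graphing side is sorted out.

```python
# Made-up example data - in the exercise this could be scraped instead.
temperatures = [18, 21, 25, 19, 23]

# Variables for summation and average calculation.
total = sum(temperatures)
average = total / len(temperatures)
print("Total:", total)
print("Average:", average)

# A very simple text "bar chart": one row of asterisks per value.
for day, temp in enumerate(temperatures, start=1):
    print("Day", day, ":", "*" * temp)
```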


Usually, getting the students to write something from scratch will simply lead to a lot of confused students, because they won't remember the things they've learned before, or they'll have forgotten the exact syntax. However, I'm going to bet on the students for the final combined exercise. I think if I provide a syntax cheat-sheet, the students will be able to go off and create their own program from scratch, without an example program to work off.

Since this is an intermediate workshop, I think I can risk the students being confused. If I can successfully pull this off and they are able to construct their own programs from scratch, then they'll feel empowered - they'll feel like they can actually program! That's really important for high school students wondering what they're going to do with their lives, and wondering if they can really be a programmer (which probably affects girls more, I think). This is an objective that I really want to aim for in the upcoming workshop.


I'll also need an optional module near the end to possibly fill time if the students are running out of things to do. Since there's no reply from the IT administration about installing PIL or pillow, I'll have to go with the audio generation as the optional extension for now. However, with unforeseen pre-requisites popping up left and right, somehow I doubt that I will need that optional module.

Coming up with solid web scraping example exercises is actually damned hard. On a more optimistic note, however, the IT department managed to get pillow and requests installed for me! Thanks, IT department. Now we have all the libraries that we need.
