Tuesday, September 11, 2012

Proposal for a Technology Capacity in the Trinity Curriculum

This is a follow-on to my previous blog post about the curricular proposal that recently came out at Trinity. On Friday 9-7-2012 there was a meeting of the faculty to discuss the proposal. It came out fairly quickly that the committee that designed the proposal acknowledged that not addressing technology was a hole in the current proposal and that it needed to be addressed. They also mentioned an addendum for a "Capacity" in technology that had been given to them shortly after the first time they had presented the proposal to another group of faculty (the University Curriculum Council), but they hadn't had sufficient time to integrate it, or even decide if they liked it. In this blog post I want to lay out my idea for how technology should be incorporated, which I believe mirrors the existing addendum closely. I will provide my reasoning, add more detail than is in that addendum, and consider the feasibility based on faculty resources.

Why?
I want to start off with a quick recap of why I think that building real technological proficiency in Trinity graduates is essential today and will only grow in importance moving forward.
  • The world is run by computers and technology. Digital processors running software control your financial transactions, your utilities, your car, and pretty much everything else you use on a daily basis.
  • This is an area where primary and secondary schooling utterly fails. For example, students in Texas are required to take 4 years of English, Math, Science, and Social Studies. Foreign language carries a 2-year requirement, and the higher-level graduation plan requires three years. One can debate how effective these efforts are, but there is some foundation in all of these areas. On the other hand, Texas students are required to take NOTHING related to understanding and utilizing technology. (In fact, the way it is included in the curriculum discourages the types of students who would attend Trinity from taking it.) In a way, this makes technology education at the college level something of a remedial subject. However, it is clearly important to the future world, and we have to make sure our students aren't completely ignorant of it at graduation.
  • Computers are tools that can solve certain problems that humans are very poor at. This isn't really about technology. This is about problem solving, and having enough knowledge to be able to identify, and hopefully use, the proper tool for solving various problems. With the growth in data sets, especially publicly available data sets, finding correct answers to more and more problems is becoming something that computers are really good at and humans aren't.

What?
So what should our technology requirement be asking students to do? You can probably tell from what I wrote above that my real goal is to get students to the level of proficiency where they can use a computer to solve problems in fields that are relevant to them which are not easily solved by hand. Most of these problems involve either numerics or data processing of a scale that simply puts them outside the reach of unaided humans, but they are things that a computer can finish almost instantaneously if given the right instructions.

I think that by aiming for this objective, these courses will implicitly give students another thing that I feel is absolutely essential: a sufficient comfort level in working with technology. Comfort in working with technology matters for so many aspects of modern life, and as computing power becomes more ubiquitous, that is only going to grow. However, I don't think this is what courses should focus on. This should instead be something that falls out of the mix when we force students to use technology to solve other problems.

For me, the ideal way to do this involves programming. It doesn't have to be serious programming, but it needs to involve putting together logic using the basic logic structures that have been part of the programmer's toolkit for decades. I would argue that learning how to program changes the way a person views every problem they encounter in a way that is far more fundamental than learning a foreign language. When you program, you have to really break a problem down and figure out how to express that problem in terms that a computer can understand. So programming is, in many ways, a translation problem. You translate from normal human terms to the more formal terms and syntax of a programming language.
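To make that translation concrete, here is a minimal sketch written in VBA, the scripting language built into Excel (which comes up again below). The question, the function name, and the data are all hypothetical; the point is that the plain-English question "how many students scored above the class average?" becomes nothing more than the loops and comparison that have been in the toolkit for decades.

    ' Hypothetical example: count how many scores in a selected
    ' range are above the average of that range.
    Function CountAboveAverage(scores As Range) As Long
        Dim cell As Range
        Dim total As Double
        ' First pass: translate "the class average" into a running sum.
        For Each cell In scores
            total = total + cell.Value
        Next cell
        Dim average As Double
        average = total / scores.Count
        ' Second pass: translate "scored above" into a comparison.
        Dim numAbove As Long
        For Each cell In scores
            If cell.Value > average Then numAbove = numAbove + 1
        Next cell
        CountAboveAverage = numAbove
    End Function

Once defined in a module, this can be called from a worksheet like any built-in function, e.g. =CountAboveAverage(B2:B30).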

While I think that the programming part is critical, the way in which it is done is far less important to me and should be chosen to fit the problem area. At the faculty meeting to discuss this, someone made a negative comment about courses teaching Excel. If all a course taught was basic Excel, I would agree. However, there is a lot to Excel that goes beyond the basics. Since the goal is to focus on using technology to solve problems, and the problems should be of sufficient complexity that basic Excel won't do it, I would be perfectly happy with a course that uses Excel and has students write VBA code to implement more complex logic. Indeed, if the data sets associated with that course/topic tend to be tables of data, they probably come in either Excel or CSV format anyway, and then Excel isn't just a suitable choice, it probably becomes the ideal choice. (Other spreadsheets would work too. For example, the spreadsheet in Google Docs also has a scripting environment.)
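As a sketch of what "more complex logic" might look like, here is the sort of short macro a student could write over a table of census-style data. Everything here is an assumption for illustration: the column layout, the decade span, and the 2%-per-year threshold.

    Sub FlagFastGrowth()
        ' Assumed layout: column A = city name, B = population in 2000,
        ' C = population in 2010, D = where the result gets written.
        Dim lastRow As Long
        lastRow = Cells(Rows.Count, 1).End(xlUp).Row  ' last filled row in column A
        Dim r As Long
        Dim annualGrowth As Double
        For r = 2 To lastRow                          ' row 1 holds the headers
            ' Average annual growth rate over the decade.
            annualGrowth = (Cells(r, 3).Value / Cells(r, 2).Value) ^ (1 / 10) - 1
            If annualGrowth > 0.02 Then               ' flag anything over 2%/year
                Cells(r, 4).Value = "fast growth"
            Else
                Cells(r, 4).Value = ""
            End If
        Next r
    End Sub

Nothing here is hard by programming standards, but it answers a question over hundreds of rows that would be tedious and error-prone to answer by hand, which is exactly the kind of payoff these courses should aim for.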

The reality is that tools, whether they be Excel or something else, change over time. That is part of the nature of technology. That is also why courses should not focus just on tools. If a student takes a course in his/her first year and that course focuses only on tool usage, it is possible that tool won't even be available or supported by graduation. However, whatever tools eventually replace them will still draw on the same basic knowledge of programming/scripting. So this skill/knowledge translates well across pretty much all tools, because programming has shared elements across all languages. In a sense, certain constructs are used to describe algorithms in every language, just as things like past and present tense appear across natural languages. By focusing on problem solving and forcing students into more challenging problems that require going beyond basic tool usage, we get to the logic elements that persist across time even as tools and technology change under them.

How?
So how do we make this happen in the curriculum? For me, the main point is that the majority of students do this in a course in a department other than Computer Science. The key is that the computation should have purpose and context. There should be problems associated with a particular subject or line of study. When students take a course in CSCI at Trinity, we can throw some problems at them and give them some context, but everyone in the room has different interests, and in the Computer Science department we are often interested in the nature of computation itself more than the problems it can be used to solve. (This is much like the difference between pure and applied mathematics. Almost no one outside of mathematics cares about pure math until some application is found for it that helps them solve a problem.)

So these would be courses taught in other departments, and to get approval for satisfying this capacity, they would have to demonstrate how solving problems through technology fits into the syllabus. Some of these certainly exist. CSCI courses would qualify, but I think there are probably quite a few others around campus in departments like Communications and Urban Studies, as well as some upper-level STEM courses, which already do this without modification. More will be needed though. I think many of these courses could be created fairly easily by enhancing existing courses with assignments/projects that wouldn't be possible without the technology. Existing courses that already use technology for problem solving could work within their current hour allotments. For courses that need this added on, I would not want to see the computing elements cut into their normal content. Instead, I would rather see an extra hour added for that purpose. That extra hour would cover the time where students learn how to use the technology for problem solving, as well as where they find whatever information (such as data sets) they will use in the process. So a lot of the courses that satisfy this would go up to being 4-hour courses under the current system. It might also be possible to have the technology be an add-on lab that only some students would take. That might not work well in many cases, but allowing it as an option would be very helpful for those situations where it does work.

Resources really come into play with this proposal in the situation where additional technology-based problems are added to an existing class. If Trinity can't actually enact and sustain a proposal, then it doesn't matter whether or not it is any good. Clearly, the courses that already satisfy the requirement require no new resources. However, those will likely be a small fraction of the total. Most of the seats for this requirement would need to come from courses that are augmented with computation, and the faculty teaching those might well need some assistance to do that.

How many seats are needed? I personally think that students would benefit most from having to take 2-3 courses that fulfill this requirement. The hope is that students would see slightly different approaches. That helps them to abstract the ideas and see how to apply them more broadly. For every course that is required, we need ~650 seats/year. Sections typically hold 20-30 students, so it is reasonable to say that we need about 30 sections of these courses each year, roughly 15 per semester, for every course that is required. That means anywhere from 15-45 sections/semester to have 1-3 of these in the graduation requirements.

Is this doable? I think so, and I will go into detail below. First though, I can already see some people objecting that there is no reason there should be 3 computing/technology courses required. However, I would remind anyone making that objection that these aren't computing/technology courses. These are courses in subjects from around the campus which include a significant assignment or project that highlights using technology to solve a problem in that field. There are over 20 departments on campus, so even requiring three of these courses for graduation only implies that each department offer ~2 such sections per semester.

(If my department chair sees this blog he should probably stop reading at this point.)

Where things get harder when it comes to resources is the fact that not all faculty will feel comfortable putting this type of content into their classes. Even faculty who might be able to find great data sets online, and who want to have their students process/mine those data sets for interesting information, might not feel comfortable with the responsibility of giving the students the capability to do that type of technology-based problem solving. I don't think they should have to do it alone. I can't volunteer CLT to help with this because they have other tasks. However, the CS department and the instructors housed in it who currently teach IT skills could likely provide some support for this.

Currently the CS department teaches ~3 sections of CSCI 1311 each semester and the IT Skills instructors in the department teach ~7 sections of 1300. That is 10 sections at 3 contact hours each, or 30 contact hours per semester, currently devoted to the CC. Some of those sections would probably be kept untouched, but in a very real sense that is enough human time to assist with one hour of credit for up to 30 courses each semester. In addition, early efforts to do things like prepare video lectures that cover this material could make it possible to get students up to speed with the skills they need to solve the problems in question with less direct involvement from faculty in that aspect of the course.

In summary, the reality of the modern world is that computers run everything, and students need some knowledge of how the software that runs those computers works. They also need to know how to make computers solve problems for them in situations where the computer can do it better than a human. This should be done in the context of topics the students are studying for other reasons, not just because we are twisting their arms to code. We have the resources to make this happen. It just takes a little will on the part of the faculty. The result will be much stronger students who are more ready to enter the world of 2022.
