Short Course on Computational Economics
Overview

This is a short course with three-hour lectures from 9AM until noon, starting Tuesday, May 27th and running through Saturday, May 31st. After lunch each day there will be a three-hour lab session where you can get hands-on experience implementing concepts taught in the preceding lecture. I will provide example code for solving a variety of problems and tasks, illustrating the power of modern computational technology for providing practical solutions to a wide range of problems in empirical economics, econometrics, and economic theory (both micro and macro, and both "single agent" problems as well as "multi-agent" and equilibrium problems).
The world is changing very quickly with the rise of the internet, e-commerce, and huge online databases. I will start the course by discussing some almost scary predictions about a fast-approaching "singularity" that is a consequence of exponential growth in technological progress, including Moore's Law. As a concrete example, I will discuss progress in supercomputing: in January I was part of an NSF panel charged with advising NSF on awarding a contract to construct the next-generation national supercomputer, which will perform in excess of 1 petaflop. The computational hardware we have access to today is truly amazing.

After some opening philosophical remarks on trends in technology, I will turn to practicalities, since knowing how to do parallel processing and at least elementary computer programming will increasingly have huge payoffs. Doing empirical work in the future will be greatly assisted by knowing a few useful computational tools: a) relational databases, b) web-database interfaces (e.g. cgi-bin), and c) programs for parsing/extracting data in text and other formats (e.g. Perl, PHP, etc.). The purpose of this first lecture is to give you an overview of the amazing hardware and software available these days, to show you some of the things that can be done with it, how it has transformed and will transform our daily lives, and how it is transforming science and the way we think about and do science.
Anyone starting out in computational economics should have a firm grasp of the power and generality of the method of successive approximations and of Newton's method for computing fixed points and solving nonlinear systems of equations. These methods and variants thereof are the basis of the vast majority of numerical methods currently in use, including optimization methods and methods for finding equilibria. Beyond that, the single most important tool for doing economics these days is dynamic programming. I will presume students have some familiarity with it, review methods for solving dynamic programming problems, and then discuss applications where the solutions to dynamic programming problems are nested as subroutines within an "outer" optimization or equilibrium problem.
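Both workhorse methods can be sketched in a few lines. The Python illustration below uses the classic fixed point x = cos(x); the example problem and tolerances are my choices for illustration, not material from the course.

```python
import math

def successive_approximations(f, x0, tol=1e-10, max_iter=1000):
    """Iterate x <- f(x) until successive iterates differ by less than tol."""
    x = x0
    for _ in range(max_iter):
        x_new = f(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("successive approximations did not converge")

def newton(g, dg, x0, tol=1e-10, max_iter=100):
    """Newton's method for solving g(x) = 0, given the derivative dg."""
    x = x0
    for _ in range(max_iter):
        step = g(x) / dg(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton's method did not converge")

# Fixed point of f(x) = cos(x) is the root of g(x) = x - cos(x)
fp = successive_approximations(math.cos, 1.0)
rt = newton(lambda x: x - math.cos(x), lambda x: 1 + math.sin(x), 1.0)
```

Successive approximations converges linearly (when f is a contraction), while Newton's method converges quadratically near the solution; both land on the same answer here.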
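Dynamic programming can likewise be sketched briefly. The discrete "cake-eating" problem below is a stock textbook example, chosen here for illustration; it solves the Bellman equation V(k) = max over c of sqrt(c) + beta*V(k-c) by value iteration, which is itself just successive approximations applied to the Bellman operator.

```python
import math

def value_iteration(K=10, beta=0.9, tol=1e-10):
    """Solve a discrete cake-eating problem by value iteration.

    State k = units of cake remaining; action c in {0, ..., k} is
    consumption today; per-period utility is sqrt(c); the cake left
    tomorrow is k - c. Returns the converged value function V[0..K].
    """
    V = [0.0] * (K + 1)
    while True:
        # One application of the Bellman operator over the whole grid
        V_new = [max(math.sqrt(c) + beta * V[k - c] for c in range(k + 1))
                 for k in range(K + 1)]
        if max(abs(a - b) for a, b in zip(V, V_new)) < tol:
            return V_new
        V = V_new
```

Because the Bellman operator is a contraction with modulus beta, the sup-norm distance between sweeps shrinks geometrically, so the loop is guaranteed to terminate. In the nested ("outer optimization") applications discussed in the course, a solver like this sits inside a loop over structural parameters or equilibrium objects.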