As you may have seen in today's Chronicle of Higher Education, the NEH has just announced our new Humanities High Performance Computing initiative -- HHPC for short. Our goal is to start a conversation about how high performance computers -- supercomputers -- can be used for humanities research. We are working with colleagues at the Department of Energy and the National Science Foundation to provide you with information on how high performance/grid computing and data storage might be used for work in the humanities. We are also announcing a new grant competition with DOE to award time and training on their machines. I urge you to check out our HHPC Resources page for more information.

Personally, I'm very curious to see where this new initiative takes us. It started over a year ago, when the Office of Science at DOE approached us about making supercomputers available to humanities researchers. Last July, we invited a number of humanities scholars and supercomputing specialists to meet with us here at the NEH so we could think hard about this issue. Certainly, everyone in attendance agreed that only a limited number of humanities projects today require high performance computing. But one of the things we learned from our colleagues at NSF and DOE is that this was also the case in the sciences in the not-too-distant past -- scientists, too, had to learn about supercomputers before they could begin applying them to their work. Computation has proven an effective tool for scholarship, and while supercomputers may be useful for only a small slice of the humanities today, I think it is safe to say that slice will grow over time.

So think of the HHPC initiative as a way of opening doors -- a way of starting conversations and getting scholars, computer scientists, and information scientists talking about how their fields might work together.