I attended a "hot topics" session yesterday on cyberinfrastructure, largely to figure out the meaning of this somewhat fuzzy concept. The moderator introduced it as a ragtag group of computing functions: cluster computing, digital libraries, data visualization, high-speed networks, etc.
Gradually, it became apparent that many folks in the higher ed IT world see this as a movement to take a higher-profile role in the academic missions of their institutions by bringing integrated high-performance computing resources to a wide range of academic disciplines. There was a lot of talk about how IT departments now focus on mundane administrative systems, the "plumbing" of the institutions. Cyberinfrastructure is seen as an effort to make IT more strategic, to move from "administrative dweebs" who put up barriers to research (firewalls) to folks who "get stuff done" for faculty. This has strange echoes for me of conversations among academic librarians about ways to become more relevant, to get involved in a deeper way in research and teaching.
There wasn't as much talk as I expected about the role of extra-institutional systems in supporting this, including systems run by the commercial sector. At least on the library side of things, we're seeing some of the most powerful, web-scale digital library systems (Google Books) in the hands of a commercial vendor.
In my view, the cyberinfrastructure question, especially the adoption of high-performance computing in disciplines that are less comfortable with computing, has a lot to do with raising scholars' expectations of IT and of themselves in terms of research methodologies. I doubt we'll see philosophers clamoring for these things, but I'm sure our geologists, sociologists, and historians will be there soon.