

I attended the Singularity Summit 2007 this past weekend in San Francisco. Here's a good summary of the sessions. While I was there, I got a chance to speak with Steve Omohundro, Peter Norvig, Barney Pell, and Sam Adams.
The X-Prize Foundation announced plans to enter the education space. They want to offer an Education X-Prize, and they were looking for suggestions from the summit audience on how best to frame the problem.
Here are some thoughts:
There is no general consensus on when a singularity will happen, whether it is likely to be beneficial or destructive to human life, or whether one will happen at all. There isn't even an agreed-upon definition of what a singularity is. But that's the point of this summit: to define what we're looking for, to figure out how to influence it positively, and to raise awareness of a potentially huge issue. I think Ray Kurzweil's writing (like The Singularity Is Near) remains a good reference, especially for people just joining the singularity discussion.
There seems to be a fair amount of venture capital available for startups in the artificial general intelligence (AGI) space. (Several investment firms were represented at the summit, including Clarium Capital, the John Templeton Foundation, and Draper Fisher Jurvetson.) Several stealth-mode startups are also working on AGI technology and products.