That seems like a slightly silly thing to do until you consider the way things are going. Fancy web pages are suddenly all about Ajax, and while that's conceptually straightforward, it's a pain in the ass to code and more of a pain to integrate into the server side of one's site. So the hot idea is that you write your entire site in Java or C#, both the server and client, and then use these "compilers" to turn the client side into Ajax-based web pages.
So the question is, how long will it be before a true pseudo-machine intermediate language shows up for browsers? History indicates that, when you find yourself compiling into a high-level language, that's almost always an evolutionary step on the way to developing a more appropriate low-level language for the situation. I don't think that's certain in this case (heaven knows the Net has shown a remarkably cavalier disregard for efficiency in its standards), but it seems plausible. A standard pseudo-machine would be rather useful in a number of respects -- it would be more compact, it would run faster, it would be easier to compile into, and it would give folks an opportunity to get the semantics a little more precise this time around.
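To make the "more compact" point concrete, here's a toy sketch of the sort of thing I mean -- a hypothetical three-opcode stack machine, written in Java. Everything here (the opcodes, the encoding) is invented purely for illustration; a real browser standard would need a vastly richer instruction set, but even this shows why a bytecode program is smaller and cheaper to dispatch than re-parsing the equivalent source text on every page load.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class PseudoMachine {
    // Hypothetical opcode set -- a real standard would be far richer.
    static final int PUSH = 0, ADD = 1, MUL = 2;

    // Executes a compact bytecode program; returns the top of the stack.
    static int run(int[] code) {
        Deque<Integer> stack = new ArrayDeque<>();
        int pc = 0;
        while (pc < code.length) {
            switch (code[pc++]) {
                case PUSH: stack.push(code[pc++]); break;
                case ADD:  stack.push(stack.pop() + stack.pop()); break;
                case MUL:  stack.push(stack.pop() * stack.pop()); break;
            }
        }
        return stack.peek();
    }

    public static void main(String[] args) {
        // (2 + 3) * 4, encoded in eight ints -- no parser, no
        // ambiguous semantics, just a dumb dispatch loop.
        int[] program = {PUSH, 2, PUSH, 3, ADD, PUSH, 4, MUL};
        System.out.println(run(program));
    }
}
```

The interesting part isn't the arithmetic, of course -- it's that the expensive, ambiguity-prone parsing step happens once, at compile time, instead of in every browser on every visit.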
It'll be interesting to see if it happens. The only reason I can see this *not* happening is that it would require the IE and Firefox camps to agree on a new standard, and that alone could tie things up for years. But it seems like an idea whose time is coming...