Sorry if this has been posted before. While reading some stuff on Javalobby, I was pointed to an interesting interview hosted on Artima.com.
Really quite an eye-opener.
What we've done in the past four or five months is a very massive refactoring of the IntelliJ IDEA codebase, to separate support for Java from the core platform. We have extracted the functionality needed to support an IDE for any language: parsing, refactoring, code manipulation, and so on.
With that in place, we can start building products for other languages based on that platform. Because not all those products will include all the functionality of IntelliJ IDEA, they can be much less expensive, and we can choose the price based on the target market. And developers don't get the stuff they don't care about. The new products can be much slimmer, lighter, and easier to get started with.
Sounds like a plan ... :)
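If I'm reading that right, the core platform now depends only on language-agnostic interfaces, and Java support becomes just one implementation among many. Here's a rough sketch of that shape; to be clear, these interfaces are my own invention for illustration, not JetBrains' actual openapi:

    // Hypothetical sketch of "language support as a plugin": the core
    // platform knows nothing about Java, only about these interfaces.

    import java.util.List;

    interface LanguageSupport {
        String languageId();                    // e.g. "java", "ruby", "python"
        SyntaxTree parse(CharSequence source);  // language-specific parsing
        List<Refactoring> availableRefactorings(SyntaxTree tree);
    }

    interface SyntaxTree { /* nodes, text ranges, etc. */ }

    interface Refactoring {
        String name();                          // e.g. "Rename", "Extract Method"
        SyntaxTree apply(SyntaxTree tree);
    }

    // The platform core only ever talks to LanguageSupport, so shipping a
    // slimmer, cheaper product is just a matter of bundling fewer
    // implementations.
    final class IdePlatform {
        private final List<LanguageSupport> languages;

        IdePlatform(List<LanguageSupport> languages) {
            this.languages = languages;
        }

        LanguageSupport forLanguage(String id) {
            return languages.stream()
                    .filter(l -> l.languageId().equals(id))
                    .findFirst()
                    .orElseThrow(() ->
                            new IllegalArgumentException("No support for " + id));
        }
    }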
We now have a framework we can use to build debuggers for any runtime system. The actual way we do that for each language is not something we came up with ourselves: for both Ruby and Python, there are standard solutions in this space, such as protocols for talking between an IDE and a debuggee. For Ruby, there's a project called ruby-debug, and there's something similar for Python as well. We plan to implement all those standard protocols, along with call stacks, breakpoints, watches, and so on. We also abstracted out a common UI framework for building debuggers.
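The "standard protocols" bit is worth unpacking: ruby-debug, for example, runs a small debug server inside the debuggee process, and the IDE talks to it over a socket. A toy client in that style might look like the following; the commands here are made up for illustration and are not the real ruby-debug wire format:

    // Sketch of a socket-based debugger client: connect, set breakpoints,
    // resume execution, and read events back from the debuggee.

    import java.io.*;
    import java.net.Socket;

    final class RemoteDebuggerClient implements Closeable {
        private final Socket socket;
        private final BufferedReader in;
        private final PrintWriter out;

        RemoteDebuggerClient(String host, int port) throws IOException {
            socket = new Socket(host, port);
            in = new BufferedReader(new InputStreamReader(socket.getInputStream()));
            out = new PrintWriter(socket.getOutputStream(), true); // auto-flush
        }

        void setBreakpoint(String file, int line) {
            out.println("break " + file + ":" + line);  // hypothetical command
        }

        void resume() {
            out.println("cont");                        // hypothetical command
        }

        String awaitEvent() throws IOException {
            return in.readLine(); // e.g. a "stopped at file:line" notification
        }

        @Override
        public void close() throws IOException {
            socket.close();
        }
    }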
But then I came across this!
There are some people who hold the opinion that you should not be using a debugger at all: instead of debugging, you should write unit tests that cover every method individually, so that if something breaks, you don't need to debug; you get a failing test instead.
Who are these masochists?
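To be fair to the masochists, the idea is that a regression then surfaces as a named failing test rather than a debugging session. Something like this (JUnit 4; PriceCalculator is a class I invented for the example):

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class PriceCalculatorTest {

        @Test
        public void appliesTenPercentDiscountAboveThreshold() {
            PriceCalculator calc = new PriceCalculator();
            // If someone breaks the discount logic, this test names the
            // broken behaviour -- no breakpoints or watches required.
            assertEquals(90.0, calc.discountedPrice(100.0), 0.001);
        }
    }

    class PriceCalculator {
        // 10% off orders of 100.0 or more (an illustrative business rule)
        double discountedPrice(double price) {
            return price >= 100.0 ? price * 0.9 : price;
        }
    }

Of course, a failing test tells you that discountedPrice is wrong, not why; for that you're back to the debugger.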