In his post today, Marshall Fryman discusses the problems of a shared development environment, or workgroup development. It is a complex problem, and one that is rarely addressed.

Note: even though we are talking about Delphi here, this really applies to any development environment.

Let's look at the problem more closely.


You have a development environment with over a dozen development workstations, all configured similarly: the same versions of the development tools, the same locations, the same components. How do you update or upgrade this environment easily and keep it up to date?


1. Virtualization

My vote goes to virtualization: choose VMware, Microsoft Virtual PC, or another solution.
Then create a base image that everyone shares, and change your settings so that anything user-specific goes to the developer's local drive instead of the virtual image. Yes, you will need some degree of standardization, but that is true of any development environment nowadays.

If you go with the server editions of such products, it can be even easier, since some of them allow each developer to maintain their own session without affecting the main image.

Then make sure your virtual environment has priority over your hardware, so that while you are developing you do not really suffer the performance penalty of virtualization. With new versions of Windows coming, I think this will become common and supported by default. For now, you will have to do some maintenance yourself.

2. “Virtualization”

The same name, but a different way to achieve it. Instead of virtualizing the whole Windows environment, you virtualize only your Delphi environment.

What does a Delphi environment consist of?

  • Delphi installation. We have to assume that everyone is on the same version. It is best to have it in the same location, but this is not strictly necessary.
  • Project source code. If you use any change management system, you most likely already have everything in one place, in the same folder structure. Projects are opened from the same locations, and EXE/DCU files are created in the same designated areas.
  • Components and wizards. These can be a problem, and this is where you really put “virtualization” in place.

Let’s think about it for a second. What does Delphi need in order to use custom components properly?

The locations of the BPL, DCP, DCU, and DFM/RES files! The IDE does not really care where the code was compiled; it only cares that the components are properly registered and that it knows where to find all the required files.

Keeping that in mind, let’s create proper infrastructure:

  1. Put everything in your source control system and check it out to the same locations on all machines. You can keep the BPL/DCP files in the same folder as the DCU files or in separate locations. The folder with the DCU files should also include all the necessary DFM/RES files.
  2. In Delphi, add the new DCU location to the “Library Path” in the IDE options, and set the BPL/DCP location as the “BPL/DCP output directory“.
  3. Do not add the source location to the “Browsing path“: you do not want THIS shared code to be recompiled on individual machines. This also prevents some “strange” errors that occur when DCUs start to “walk around”.
  4. Then install all the necessary design-time packages, and you are done.
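As a concrete sketch, the shared tree checked out on every machine might look like this (all names here are illustrative, not a prescribed layout):

```
C:\Dev\Shared\
  Source\   component source under version control; NOT on the Browsing path
  DCU\      compiled units plus their DFM/RES files; add to the Library Path
  BPL\      compiled packages; set as the BPL output directory
  DCP\      package import libraries; set as the DCP output directory
```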

How would the upgrade process work?

Now it is easy. On your master computer, update the components, recompile if necessary, and check them into your source control system. Then, before launching the Delphi IDE, your developers get the latest files from the repository, which updates all the components and the corresponding DCU/BPL/DCP files.
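The “update, then launch” step can be wrapped in a small launcher script each developer runs instead of starting the IDE directly. Here is a minimal sketch in Python; the paths, the use of Subversion, and the IDE executable name are my assumptions, not part of the original setup:

```python
import subprocess

# Hypothetical locations -- adjust to your own layout.
SHARED_ROOT = r"C:\Dev\Shared"  # working copy holding the BPL/DCP/DCU folders
IDE_EXE = r"C:\Program Files\CodeGear\RAD Studio\5.0\bin\bds.exe"

def build_commands(shared_root, ide_exe):
    """Return the commands to run, in order: update the shared
    components first, then launch the IDE so that it loads the
    freshly updated BPL/DCP/DCU files."""
    return [
        ["svn", "update", shared_root],  # or your VCS's equivalent command
        [ide_exe],
    ]

def update_and_launch(shared_root=SHARED_ROOT, ide_exe=IDE_EXE):
    # Run each command in order, stopping if the repository update fails.
    for cmd in build_commands(shared_root, ide_exe):
        subprocess.run(cmd, check=True)
```

Because the update runs before the IDE starts, no design-time package is in use while its BPL is being replaced.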

It is rare for new components to have to be reinstalled in Delphi, since the file/BPL structure usually stays the same. Even when such maintenance is required, it can be accomplished with a simple .REG file deployed and applied to all machines. But that is a rare situation.
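When a package does have to be re-registered, such a .REG file could look like the following. This is a sketch for Delphi 2007 (which keeps its settings under the BDS\5.0 registry key); the package path and description are made-up examples:

```
Windows Registry Editor Version 5.00

; Register a shared design-time package with the Delphi 2007 IDE.
; The value name is the full path to the BPL; the value data is the
; description shown in the IDE's package list.
[HKEY_CURRENT_USER\Software\Borland\BDS\5.0\Known Packages]
"C:\\Dev\\Shared\\BPL\\MySharedComponents.bpl"="Shared component package"
```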

The result: you now have a fully maintainable, distributed development infrastructure.


Both scenarios are the result of my own experience resolving the problem of managing a workgroup environment, and both have been used in real life.

1 Comment

Marshall Fryman · Aug 12, 2008 at 17:23

Serge –

I agree either option would work. In fact, we finally settled on your second “virtualization” technique as a semi-standard deployment method. I really find either virtualization method to be seriously lacking in capability. I’m not sure why development environments don’t take into account that the dev environment is just as important as the code being developed. I no more want to blindly install SP 999 from CodeGear than I want to accept code from a fellow programmer without the ability to “undo” the changes.

One of the problems is that the development environments for Windows typically use the registry for everything (which, IMHO, is a terrible invention) forcing you to manage the registry AND the dev environment AND your code. Are there add-ons that let you preserve the registry? Sure. One could even argue that you could use a system restore point to manage the process. The problem with either solution is that they both assume you have a “dead on arrival” (or install in this case) problem. When you have tens or hundreds of projects that all have different life cycles, you may not discover a problem until months have passed. Once you have a problem, you are then stuck trying to integrate a long period of changes with a restore point of some type.

I’m sure someone is going to come along and say, “You should always test everything when you install a new version.” In fact, I do agree with that. Unfortunately, I’ve discovered that a test compile or even unit tests do not reveal all the subtle changes that can occur. For instance, the new unicode change in D2009 is likely going to show lots of funny little changes that only a full QA sweep will detect (hopefully). I know when we went from string as a shortstring to string as a long string, I fought with it for years. If I had to do a full QA for every program I have, for every dev environment upgrade, it would mean I could never upgrade.

At any rate, I feel the dev environment should “stream” back and forth in time just like the code we develop does with a source control package. In fact, I suspect a good streaming system would mean people would upgrade more frequently resulting in more revenue for CG. As it is, we waited from D7 to D2007 before upgrading. That kind of cycle can’t be good for the bottom line.

