- Development Project Issues
- Current Popular Desktop Database Development Platforms
- Some Simple Comparisons
- Update Notes (to November 2009)
Note: This White Paper was written in 1996. Since then the tools have improved and changed, but the arguments presented herein concerning the relative merits of In House development compared with purchasing a Commercial Application remain valid.
This paper examines the issues which arise when a business is considering in-house development of a mission critical software application when an off-the-shelf package is available. This situation typically arises when the application replicates a common business procedure.
- Does in-house application development make commercial sense?
- What are the inherent risks?
- What are the available tools?
- What other issues should be taken into consideration?
The purpose of this document is to provide a discussion of the issues and an objective comparison of the major database development tools currently available. The benefits and shortcomings of each tool are analysed in turn.
Development of a modern application is portrayed by the providers of development tools (such as Microsoft) as a simple process which can be undertaken by staff who are not professionally qualified programmers. Point-and-click simplicity is stressed: simple design of screens and easy visual design of reports, meaning application development without the need to write any code. They encourage the rapid development of in-house prototypes which can then be further developed to become the deployed application.
Almost all the vendors of such tools use this type of marketing to sell their products. This rosy picture is very appealing to the hard pressed manager who has pressing business reasons to automate an office or factory business process. However, in reality, very few business processes are truly this simple.
This report is restricted to the principal development tools currently available for developing database applications in a Windows environment which are representative of what is available.
For important specialist applications (defined as those where the consequences of failure will probably result in financial loss) the following statements for an in-house development are likely to be valid:
- The probability of achieving the desired result is low.
- The timescales for development and implementation are likely to be much longer than initially predicted.
- The costs are unlikely to be acceptable, or anywhere near those budgeted.
- Maintenance is an uncapped drain on the IT Department's resources, whereas an annual contract with a supplier is fixed and constant upgrades are available.
- When a visual application development tool is used, the speed of operation is unlikely to be satisfactory to the operator.
- If a visual application development tool is used, the life of the application is unlikely to be very long as transaction volumes increase beyond its capability.
- The actual solution finally provided will probably not achieve the savings in costs which made the project valid in the first instance.
The inescapable conclusion is that where existing products exist, they should be used unless it can be demonstrated that they CANNOT handle the majority of the business process. Suppliers of products are likely to have substantial experience not only of the business process but also of programming and product development. The fact that a product handles other processes in addition to the one required is not a valid reason to claim that the business process is not handled. If there is a substantial cost differential, then there may be a case. This is, however, unlikely.
This was confirmed by an article in the November 1998 edition of IT Consultant.
Copyright IT Consultant. Published by Hillgate Communications Ltd
Articles are published without responsibility on the part of the publisher or authors
for loss occasioned to any person acting or refraining from action as a result of any view expressed herein
Prospective vendors should be considered using the following criteria, which are listed in order of importance as defined by "Consultants Conspectus" magazine (10/95):
- Vendor's Long Term Vision & Strategy
- Vendor's Product Know How
- Vendor's Business Stability & Ability to Support
- Compatibility of Systems
- Ability to Tailor
- Initial Price and Annual Maintenance Cost
All of the above criteria are important, and when an in-house project is being considered, the same criteria should be applied to the development team that is assembled.
Business Stability equates to People Stability - i.e. what happens to the in house project team as a result of staff being promoted, transferred, or leaving. This is important as the following matters arise during the life of a product:
- The ability to correct coding errors
- The need to correct business process errors
- The ability to improve the business process as required
- The need to support users when things don't work
Cost of Failure
John Imlay, chairman of Dun & Bradstreet, gave a lecture to Scottish Enterprise in August 1995 on why companies should outsource their software development, citing the following stages:
- during the in-house development phase:
- Wild Enthusiasm
- Diminished Expectations
- Abject Failure
- during the later management phase:
- Search for the Guilty
- Termination of the Innocent
- Promotion of the non-involved
The costs to a business of a failed IT project are usually enormous. They include not only the costs of development, but also the costs of disruption to the business, which may in itself have substantial knock-on effects on the company's mainstream business. As an in-house IT project is rarely started before a need is identified, the development time (which is usually underestimated) represents an ongoing loss, compared with a bought-in solution, until the new system has been developed and implemented.
Pre Project Costs
Industry experience, over a wide variety of development projects, shows that there is no substitute for careful research before any development (prototype or otherwise) is undertaken.
There are both direct and indirect costs involved in the typical project, therefore it is essential that the business case for an in-house development project be solid and fully costed.
Given the costs and risks involved in an internal IT project, it is essential that, at an early stage in any in-house project, a full review be conducted of the commercially available packaged systems which address the defined business process requirements.
Consideration should be given to reducing the scope of a project when a "90%" (or even 70%) solution is found. This may well not only be the cheapest option, but may also demonstrate that the missing percentage is either not reasonably achievable or unnecessary.
The case for an in-house project must show clear cost benefits or overwhelming business process benefits before ANY internal development project is embarked upon.
The political connotations of allowing the IT department to become indispensable, through the company's reliance on a "single vendor", or of the IT department's wish to create jobs for itself, always need to be carefully examined.
The following are common elements of direct cost:
- Prior to the commencement of the development, many matters must be sorted out. These include developing a precise specification of what the software is to achieve in business process terms, and setting a limit to what it is intended to achieve.
- The task of writing a specification is in itself not an easy task: the risk when this is carried out by a user is that vital requirements are omitted because they are normally carried out automatically, and thus not recognised.
- Many projects fail because they set out to achieve the unachievable. Considerable skill is required to define what can and cannot be reasonably achieved.
- Development Team
- Direct costs are fairly obvious: the main component is the time used by the members of the development team. Even a small project will require the creation of a development team comprising, at a minimum, a Project Manager, a Designer/Developer, and a User/Tester. It is important to designate at least one end user of the proposed system as a project team member.
- User Licences
- Other major direct costs include the cost of the development platform licences and the cost of licences for third party tools required by the development platform.
- Very often, it will be necessary to purchase a "run time" version of the development tool such as for Microsoft Access for each and every user.
The following are common elements of indirect cost:
- Management Time
- For a successful project the management team will have to invest a considerable amount of time in project review meetings between the project manager and end users.
- End User Time
- Disruption to the users' working environment. During the project, the development team will often need to take up end users' time with questions about specific processes and for user feedback about system prototypes (the end users very often enjoy this involvement, to the detriment of their function and productivity in your business).
- Deployment Costs
- Deployment of an in-house developed application can be expensive; networked applications almost always require some manual configuration, and even single desktop applications can be awkward to deploy successfully.
- While many development platforms provide configuration tools for developing installation disk sets, what do you do when the end users discover that their word processor has stopped working on the Monday after the weekend your development team installed the new Order Processing system? Sadly this type of disaster can and does occur.
- Hardware Costs
- Often the in-house solution is developed on a PC with a specification considerably higher than the end users currently use. Major upgrades are therefore often required.
- Configuration Mgt
- Configuration management is related to the previous topic. It is the on-going process of ensuring that all the applications on an end-users computer continue to work together as new applications are added or are upgraded. This is part of the cost of ownership of a PC.
- Ensuring that an in-house application will continue to be useful for many years requires that you maintain a permanent in-house development team. The development team will have to regularly review the application's performance and provide regular fixes as new bugs are discovered (bugs will continue to be found long after the application has gone live; this is a fact of life and is not peculiar to in-house applications).
- Retaining the skills of a competent development team can be difficult. The members of a successful team will often want to move on to new and bigger challenges, and it may be that your company's internal IT requirements do not offer sufficient challenge for such individuals. This will require the recruitment of new staff and significant training costs.
- Often, during the settling-in period when a new team member is getting to grips with an application, you will find that the number of problems with a live production system actually increases as a result of changes put in place by the new developer. This may happen even in well-documented systems, and even where the new team member is a highly experienced IT professional; it simply results from the complexity of modern-day applications.
IT projects have a very bad reputation for overrun. The costs and the distraction from the primary goals of the business can be devastating for some companies. Not only do they fail to achieve the business benefits promised by a new system, but they also find that even more end user time is consumed by the development team as they go back to the end users to try to find out exactly where they went wrong.
Overrun occurs for many reasons. It may be found that the overrun is acceptable if the application's structure is largely in place and appears correct, and the only problems are cosmetic. Where overrun becomes fatal is when the development team have fundamentally misunderstood the business problem and then rush around trying to patch the system deficiencies in order to meet the revised project deadlines. This approach is rarely successful and is described as dealing with "A Can of Worms".
Fundamental design problems manifest themselves in several ways, but can usually be identified when end users report any of the following during Management Level Project review meetings.
- Appalling performance with production level data (rather than test data), as prototypes are rarely tested with realistic data.
- End users discovering that the system is "flaky" (causing random system crashes).
- End users becoming confused or overwhelmed by the system, as additional menu items are added to cope with deficiencies in the original specification.
- Practical use of the system requires more manual input effort than the manual system which was replaced.
- Incorrect processing (invalid business process or report results).
- Reports of Lost Data where the user is sure that data was entered and saved.
If it is discovered that a project has a fundamental design flaw then consideration should be given to abandoning the project at this stage as it is likely to be cheaper than the alternatives.
At all costs resist the temptation to allow the development team to deliver a quick fix: quick fixes usually turn out to be long fixes, with an interminable number of patches required over a long period of time to keep the system functional.
Any review of the system which includes a rethink or major structural change should be costed against the option of outsourcing a solution. Indeed the business case for an in-house development should have already reviewed and costed externally available solutions, and should have demonstrated the cost benefits of an in-house solution taking into consideration all of the cost factors detailed in this section.
The following steps are normally associated with carrying out a project:
- Feasibility Study: Research commercially available packages which address the problem. Review the business process thoroughly and produce a clear Operational Requirements specification. This is not a software specification.
- Tender: Request Proposals from external vendors based on the Operational Requirements Specification. Get the in-house team to submit a Proposal under the same restrictions as the external vendors.
- Project Plan: The Project Plan should be detailed and complete, whether produced by an external supplier or the in-house team. Hold regular project reviews. Choose and test the development tools carefully. Make sure that the application has been properly sized (hardware requirements, required response times, etc.). Plan early for the deployment phase.
- Allocate Resources: Staff the development team appropriately. Be realistic about time-scales (time is an important resource). Involve the customer (the end user).
- Review: Don't get over-enthusiastic about the development team's early prototypes. Use the prototypes only as tools to review the system requirements and performance. Set specific target dates for each of the deliverables. A deliverable should be a separate testable entity and will help you detect problems early in the project lifecycle.
Typical In-house Budget
The following represents a budget for a project to develop a fully featured registration system for recording the issue and receipt of technical documents to meet the requirements of ISO 9001 for use in the project management environment in the building and construction industry:
|Task||Personnel||Days|
|Define Operational Requirements||Project Manager||
|Identify Development Tool||Project Manager||
|Develop & Test Software||Software Programmer||
|Document the development to enable later changes, error corrections, etc.||Software Programmer||100|
|Write User Manual||Software Programmer||
|TOTAL||Bookable project days||
(at 200 days per year)
Typical Cost Comparison of In-house to Purchased System
The costs of procuring an available system are compared to the development cost indicated above.
|Purchase of System||£10,000|
(average salary £30,000 per year)
The recurring cost of maintenance and upgrades must also be considered:
|Support Contract (at 15%)||£1,500|
|In-house Support (3 days/month)|
(average salary £30,000 per year)
It should be remembered that even if the in-house support team have nothing else to do, their services must be retained which introduces a substantial additional cost. Alternatively, they may become so stretched with other things that they are unable to provide the support required at the critical moment, whereas the support contract will identify the response times.
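The figures quoted above can be checked with simple day-rate arithmetic. The sketch below uses only the numbers stated in the tables (£30,000 average salary, 200 bookable days per year, 3 in-house support days per month, and a 15% support contract on a £10,000 purchase); treating bare salary as the full cost of a staff day is a simplifying assumption, since real day rates also carry overheads.

```python
# Back-of-envelope comparison of recurring support costs, using only
# the figures quoted in the tables above. Overheads are deliberately
# excluded, which flatters the in-house option.

ANNUAL_SALARY = 30_000          # pounds, from the text
BOOKABLE_DAYS = 200             # days per year, from the text
day_rate = ANNUAL_SALARY / BOOKABLE_DAYS     # 150 pounds per day

# In-house support: 3 days per month, as in the table
in_house_support = 3 * 12 * day_rate         # 5,400 pounds per year

# Support contract: 15% of the 10,000 purchase price, as in the table
contract_support = 0.15 * 10_000             # 1,500 pounds per year

print(f"Day rate:         £{day_rate:,.0f}")
print(f"In-house support: £{in_house_support:,.0f} per year")
print(f"Support contract: £{contract_support:,.0f} per year")
```

On these assumptions the in-house support days alone cost more than three times the support contract, before the cost of retaining the team between calls is even counted.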
Typical Simplified In-house System
The scale of the application that is actually required to fulfil the needs of the user is often simplified to the extent that although the process is replaced, the replacement provides the user with no real savings in effort.
The following represents a budget for a project to develop a simple registration system for recording the issue and receipt of technical documents:
|Task||Personnel||Days|
|Define Operational Requirements||Project Manager||
|Identify Development Tool||Project Manager, Software Programmer||1|
|Develop & Test Software||Software Programmer||
|Document development to enable later changes, error corrections, etc.||Software Programmer||
|Write User Manual||Software Programmer||
|TOTAL||Bookable project days. Allow 4 days per week:||32 days|
= 8 weeks
The cost of such a system would therefore only be £5,000. The same headings have been used as for costing a full application, so it is immediately evident where the major differences lie.
The data entry is unlikely to be as simple as with a well structured system. The more complex operations are likely to be quite tedious, and be based on simple point and click selections which are extremely time consuming, and prone to error when being repeated due to operator boredom.
Typical Cost Comparison of Simplified In-house to Purchased System
The costs of procuring an available system are compared to the development cost of the simplified system as indicated above. Again, as the recurring cost of maintenance and upgrades must be considered, this has been added to show the total first year costs:
|Procurement Method||Cost||Maintenance||Total First Year Cost|
|Purchase of System||£10,000||£1,500||£11,500|
|Simplified In-house Development (average salary £30,000 per year)||£5,000||£6,000||£11,000|
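The first-year comparison above can be extended over several years using the same figures. This is a sketch, not a forecast: it assumes the maintenance and support costs in the table remain constant from year to year.

```python
# Multi-year cost projection from the table above:
#   purchased system:     10,000 up front + 1,500/year maintenance
#   simplified in-house:   5,000 up front + 6,000/year support

def cumulative_cost(up_front, annual, years):
    """Total spend after the given number of years."""
    return up_front + annual * years

for years in range(1, 4):
    purchased = cumulative_cost(10_000, 1_500, years)
    in_house = cumulative_cost(5_000, 6_000, years)
    print(f"Year {years}: purchased £{purchased:,}  in-house £{in_house:,}")
```

On the table's figures the simplified in-house system is £500 cheaper in year one, but its higher recurring support cost means the purchased system is already cheaper from year two onwards.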
This section deals with some of the technical issues related to the choice of development tools (platforms) for projects of various sizes.
Many projects are today undertaken without first correctly estimating the required capacity (i.e. number of transactions and supporting data tables, indexes, etc) and the required response times of the completed application.
It is also true that many vendors of development tools make extravagant claims for their particular product's capabilities. It is sadly most often the case that the limitations of a product only become obvious in the production application after it has bedded down, by which time it is far too late.
The question to be determined in advance of selecting a tool is
"What makes a good development tool ?"
In truth, virtually all of the current generation of tools have much to commend them. All are capable products dedicated to the job of producing competent reliable applications within their published constraints.
So what, then, is the problem? Surely we could just go out and select the most popular tool and be confident that our developers would soon be delivering superb productivity-enhancing applications. It's sadly not that simple; what we must ask ourselves in choosing a tool is the question:
"Is this the right tool for THIS particular job ?"
To answer this question we must ask many others, many of which involve highly technical issues which are often out of place in boardroom discussions.
The concept of scalability is an important one in IT and is related to the concept of application sizing.
Sizing an application involves a close study of factors such as:
- The required transaction throughput for the application.
- The required amount of data storage.
- The number of concurrent users.
- Required response times.
- Is it to be available over a Wide Area Network (WAN)?
- On-line availability of the system (percentage of a day required up-time).
Scalability refers to the ability of a system to adapt easily to having any of the above parameters re-specified in the light of changing business conditions.
For instance, an application developed for a single Workgroup of 10 users may be able to operate comfortably with an average throughput of 100 transactions per hour; however, it may fail partly or completely when 10 new concurrent users are added to the Workgroup. A scalable system, however, would be able to handle the 10 new users, perhaps by a simple upgrade of the network server or the addition of a new server to the network, with the application's data being shared across two servers.
It should be understood that issues of scalability are complex and that hardware solutions are not always the answer to these types of performance problem. Some platforms have inherent scalability problems built into the development tools or language.
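The workgroup example above amounts to a simple linear capacity check, which can be sketched as follows. The per-user transaction rate is derived from the example (100 transactions per hour across 10 users), while the server capacity figure is a purely illustrative assumption; real sizing requires measurement on the target hardware.

```python
# Linear capacity check for the workgroup example above.
# 10 users sharing 100 tx/hour implies roughly 10 tx/hour per user.
TX_PER_USER_PER_HOUR = 10          # derived from the example
SERVER_CAPACITY_TX_PER_HOUR = 150  # assumed purely for illustration

for users in (10, 20):
    load = users * TX_PER_USER_PER_HOUR
    verdict = ("within capacity" if load <= SERVER_CAPACITY_TX_PER_HOUR
               else "exceeds capacity -- scale out or upgrade")
    print(f"{users} users -> {load} tx/hour: {verdict}")
```

Under these assumptions the original 10 users fit comfortably, while doubling the workgroup pushes the load past the assumed capacity, which is exactly the failure mode described above.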
Development Skills (building a project team)
One major aspect is the range and level of development and business skills available in the development team.
It is of little use to have a technically brilliant team put to work on a development project without at least one or two business wise members. Technical types often miss the purpose of the application, concentrating on the technology at the cost of the end user goal. This can result in a working system that only meets a fraction of the actual business needs of the end user, or that concentrates on one business area at the expense of the area that the users really care about.
Similarly, it is just as dangerous to have a team composed of willing and enthusiastic but technically naïve developers who are mainly members of the department which is the project's customer, and who understand the business issues but have very limited appreciation of the limitations of the tools available to them (a very typical in-house development scenario). These developers will very often be seduced by the marketing claims made by the tools vendors, and only later discover to their cost that database development is a very complex subject. Such projects often need to be rescued by bringing in expensive external consultants.
The skills available, and the cost of these skills (if skills need to be hired in), should be considered carefully during the project's feasibility study and when considering the development platform for the project.
Make no mistake, experienced database developers do not come cheap, and while the visual tools which are available today make some aspects of development easier they are no substitute for a solid understanding of Relational Database Design in practical applications (beware theorists).
It is also important to note that a developer can have considerable experience with several programming languages and yet be an incompetent database developer. Database development skills are not, and cannot be, acquired without the developer having experienced a real-life database development project. Also, a database developer's specific skills acquired under one platform can sometimes act as a barrier to understanding and successful implementation of a system under another platform; developers used to XBase can sometimes be at a total loss when asked to work on an SQL-based system.
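The paradigm gap mentioned above, between record-at-a-time XBase navigation and set-oriented SQL, can be illustrated with a small sketch. Here SQLite stands in for the SQL engine, the table and data are invented for the example, and the first loop mimics in spirit the XBase habit of filtering record by record.

```python
# Illustration of the paradigm gap described above. An XBase developer
# navigates record-by-record (USE orders / SKIP in dBase terms); an
# SQL developer asks the engine for the whole result set. SQLite
# stands in for the SQL back end; the table and rows are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 50.0), (2, 200.0), (3, 120.0)])

# XBase style of thinking: walk every record and test it in the loop.
total = 0.0
for _id, amount in conn.execute("SELECT id, amount FROM orders"):
    if amount > 100:              # the loop body does the filtering
        total += amount

# SQL style: one set-oriented statement does the same work.
(sql_total,) = conn.execute(
    "SELECT SUM(amount) FROM orders WHERE amount > 100").fetchone()

print(total, sql_total)   # both 320.0
```

The set-oriented statement hands the filtering and summation to the database engine, and that shift in thinking is precisely what trips up developers steeped in record navigation.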
The main point here is that one must carefully assess the accuracy of the team's development forecasts in the light of the team's general development experience, particularly if the team is new to the chosen development platform. You must also carefully assess the training costs which will be required, if your team has chosen to implement the project in a tool which is unfamiliar to them.
Note that you cannot escape this cost: you either pay for formal training for the development team or you let them learn the new skills required 'on the job'; needless to say, the latter option can be very expensive indeed (even when you pay for formal training you will still incur an 'on the job' training cost, as it always takes a developer time to get up to speed on any new platform).
Modern Desktop Hardware Requirements
It is extremely important to test the claims of database development tool vendors on the actual hardware standard proposed for final deployment of your application.
One of the most common reasons for failure of projects is the choice of the wrong development tool for the size of project. Note that some tools are technically appropriate for any size of development [C and C++ come to mind]; however, they may not be economically efficient other than for very large projects with a large development team, or for commercial application developers who have the ability to make the large investment in development costs necessary to get a product to market.
The factors, under this category, affecting the choice of development tool are largely physical application size (hard disk space required for the executables), which affects the load-up times for the end user's application, and the system memory required for acceptable response times as perceived by the end user.
For the most part, the majority of modern database development systems require well-specified client-end PCs; the current MINIMUM reasonable specification for most of these systems is:
- Intel 486 or compatible processor
- 540MB Hard drive (for operating system and applications)
- 16MB of RAM
- 15 inch Super VGA monitor
The requirement for a 15 inch Super VGA monitor arises from the need to provide for comfortable display at the 800x600 display mode which has now superseded the old 640x480 standard.
The current crop of popular desktop database application development systems can be broadly split into the following categories:
- XBase Derivatives
- SQL based systems such as Microsoft Access
- Rapid Application Systems
- Client/Server database systems
"XBase" is a collective term for development platforms which use the open standard DBase DBF file format which in today's world is viewed as a high performance legacy file format.
Some people also take the term XBase as a reference to languages based upon the original Ashton Tate dBase III language. However this is not a fair reference as few of the systems which are XBase derived bear much resemblance to dBase. In addition many systems now provide access to data held in XBase files. We use the term XBase to refer to any system making use of the file and various indexing formats.
Pure Structured Query Language (SQL) desktop systems for the PC are currently thin on the ground. However many of the development platforms discussed here provide an implementation of SQL. Microsoft's Access is the main desktop database product in this market.
True Client/Server systems are the real home of the SQL language; these tend to be much more expensive than desktop systems, mainly as a result of per-seat licence fees. However, they can also provide more scalable and robust solutions.
An additional important consideration is that of development languages which do not in themselves have their own native database format but which exploit one or more of the main industry standard formats. With the rise of the ODBC (Open Database Connectivity) standard, these tools have gained significant ground in traditional database markets.
Development Languages with Data Manipulation Capabilities
In the last few years the distinction between dedicated database development packages (such as Access) and straight programming languages (such as Pascal) has become ever more blurred. More and more programming languages are acquiring database development capabilities. The combination of capable compilers and database support has provided developers with ever more flexibility in their projects.
While this offers great flexibility to developers it also places new demands on them. Native programming languages are usually more difficult to use and require a generally higher level of skill for successful results. Broadly speaking there are two classes of Native languages providing database support:
- Languages which acquired database add-ons
- New generation languages designed around database support but with full general-purpose programming flexibility built in - known as RADs (Rapid Application Development tools).
Microsoft's Visual Basic and Visual C++ fall into the first category, with Borland's Delphi and CA-Visual Objects falling into the second.
XBase Derivatives
As with all XBase solutions, a well designed database structure combined with a well thought out application will outperform almost all other desktop database applications.
This is largely a result of the maturity of the tools available in the XBase market. These tools were designed for an age when PC processing power was limited, and they managed to eke out every last bit of performance from their languages and data formats. With the performance available on a modern P133 Intel processor based PC, the XBase development tools really shine.
One should also remember that the majority of XBase systems use what is known as a P-Code compilation technique (pseudo code) which creates executable applications which are not internally as efficient as for example a C, C++, or Pascal application.
There are a couple of additional points to remember about XBase systems.
- They are not targeted at the End User but rather at the Developer
- The skills required for successful application development are considerably higher than for "point and click" systems.
Microsoft's Visual FoxPro
MS Visual FoxPro is an extremely competent development tool which is the native environment of the best of the current XBase index formats (CDX). Indeed, while it is not an official Microsoft position, it is true to say that Visual FoxPro is Microsoft's heavyweight desktop application development product, while MS Access is positioned more as their end-user tool.
While Visual FoxPro applications are in principle the fastest performers on the market in terms of raw data manipulation you will require a highly specified PC and a skilled developer to get the best out of this environment. It is also important to note that there are considerable questions over Microsoft's future commitment to the product.
However, in spite of these improvements, one should not forget that the provision of new facilities (in the case of Visual FoxPro, a large and complex development environment) consumes some of the benefit of the modern range of processors.
The major issues which should be borne in mind when considering Visual FoxPro for a development project are:
- Number of users (above 20 on a local area network will start to cause network traffic problems)
- The deployment platform (PC capability: Pentium-class processors with lots of memory are really required for workable results)
- Skill level of development team (XBase requires highly skilled developers for reasonable results)
- Future direction and support (will Microsoft continue with the FoxPro line or is it already a legacy system?)
In summary Visual FoxPro is suitable for all types of desktop development project but requires modern high performance hardware and a skilled development team.
Borland Visual dBase
Visual dBase is the direct descendant of the original dBase II and is now in the hands of Borland.
There is very little to choose between the Microsoft product and the Borland product; both are competent performers with similar strengths and weaknesses. Overall the FoxPro product is a better performer, while Visual dBase implements a slightly more elegant dialect of the XBase language. Again, a highly specified PC is required to get acceptable performance from applications produced by this development tool.
Similarly there is also a question mark over the future of the Borland product both internally at Borland and as a result of the financial instability of the company.
The major issues which should be borne in mind when considering Visual dBase for a development project are the same as those for Visual FoxPro.
CA-Clipper is the discerning professional developer's favourite tool in the XBase stable. While restricted to a character-based user interface, it creates small, fast applications which can be run in a terminal window under Microsoft Windows. The Clipper language is a superset of all the XBase dialects and is considered by many developers to be one of the most elegant development languages ever created.
The performance of CA-Clipper applications is equal to that of Visual FoxPro and often superior as a result of the lower overheads incurred under the character based terminal interface.
CA-Clipper, like Visual FoxPro, uses a P-Code compiler to create executable applications; however, it has the most advanced implementation of a P-Code compiler on the market today. The quality of the compiler, combined with the underlying database technologies used by CA-Clipper, results in the fastest code execution available. This is particularly noticeable in any application which has to deal with large volumes of data processing.
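The P-Code approach can be sketched as a tiny stack machine: source code is compiled once into compact opcodes, which a small runtime loop then executes. The Python sketch below is purely illustrative — CA-Clipper's actual P-Code format is proprietary and far more sophisticated.

```python
# Toy stack machine illustrating the P-Code idea: a "compiled" program is
# a list of compact opcodes, and a small runtime loop executes them.
# Illustrative only -- not Clipper's real (proprietary) P-Code format.

def run(pcode):
    """Execute a list of (opcode, operand) pairs on a simple stack machine."""
    stack = []
    for op, arg in pcode:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
    return stack.pop()

# "Compiled" form of the expression (2 + 3) * 10:
program = [("PUSH", 2), ("PUSH", 3), ("ADD", None),
           ("PUSH", 10), ("MUL", None)]
print(run(program))  # -> 50
```

The trade-off the paper describes follows directly from this structure: the dispatch loop adds overhead compared with native code, but the opcodes themselves are compact and quick to generate.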
The quality and breadth of the language and its ancillary third-party tools provide the experienced developer with the ideal development tool. There are few if any problems which could not be modelled and solved by a skilled Clipper developer, which is why it is the development tool of choice for many small and large independent software houses.
While many XBase systems have limits imposed largely by the structure of the development tool or the language, for example the control of concurrency in network applications (a weak point of the FoxPro implementations), the open design and architecture of CA-Clipper provides the developer with very fine grained control of such issues.
Another strong point is CA-Clipper's Replaceable Database Driver (RDD) architecture, which allows native access to many other database file formats. One of the most recent developments in the Clipper RDD arena has been the creation of a new RDD for the IBM AS400, which has tremendously increased the scope of the CA-Clipper product. The CA-Clipper RDD architecture is similar to Microsoft's ODBC system, but it provides superior performance as a result of its native support for these additional file formats. RDDs for both SQL and ODBC are of course available.
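The RDD idea — application code written once against an abstract data-access interface, with storage back ends swapped in underneath — can be sketched as follows. All the names here are illustrative; they are not part of any real Clipper RDD or ODBC API.

```python
# Sketch of the "replaceable driver" concept behind Clipper's RDD
# architecture: the application talks only to an abstract interface, so
# the storage back end can be replaced without touching application code.
from abc import ABC, abstractmethod

class DatabaseDriver(ABC):
    """Abstract driver interface (hypothetical, for illustration)."""
    @abstractmethod
    def append(self, record): ...
    @abstractmethod
    def seek(self, key): ...

class MemoryDriver(DatabaseDriver):
    """In-memory stand-in for a native file format such as DBF/CDX."""
    def __init__(self):
        self._rows = {}
    def append(self, record):
        self._rows[record["id"]] = record
    def seek(self, key):
        return self._rows.get(key)

def load_customer(driver: DatabaseDriver, key):
    # Application logic is written once against the abstract driver;
    # swapping MemoryDriver for another subclass requires no changes here.
    return driver.seek(key)

drv = MemoryDriver()
drv.append({"id": 42, "name": "ACME Ltd"})
print(load_customer(drv, 42)["name"])  # -> ACME Ltd
```

A second subclass (say, one backed by a server database) could be dropped in wherever `MemoryDriver` is used, which is the portability benefit the RDD architecture provides.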
So what are the drawbacks? The main problem is CA-Clipper's out-of-date character-based user interface. While this is fully customisable by Clipper developers, and many novel user interfaces have been designed for CA-Clipper applications, this cuts little ice in the modern graphical world, where users now expect modern-looking graphical user interfaces. The second main drawback is that the development of good, high-quality CA-Clipper applications requires highly skilled developers.
The user interface issues and future direction of the CA-Clipper language have been addressed by CA in the creation of their new generation of Windows development tool technology CA-Visual Objects which we discuss further later in this report.
In summary, CA-Clipper has a long history of producing solid, reliable database applications and has been in use world-wide for the last 10 years or more. It remains a superb development system, in wide use and well supported by CA (the second largest software company in the world), but it is now limited by its lack of a graphical user interface.
Other Microsoft Systems
Microsoft Access is an extremely competent development product which is quite unique in today's desktop database market.
MS Access is positioned by Microsoft as an easy-to-use end-user development product with superb support for point-and-click form and query design. In addition it comes complete with a fully featured development language of its own which is, in essence, a dialect of MS Visual Basic.
Note that the MS Access Basic development language is currently really too difficult for the average end user to get to grips with. This is slightly at odds with the product's user-friendly image as an end-user development tool. However, this does not detract from the overall power of the product.
There are a number of major issues which should be borne in mind when considering MS Access for a development project; these are discussed in the following paragraphs.
MS Access uses a proprietary database format. As Microsoft is unlikely to publish it in the near future, few products other than Microsoft's own can use data held in MS Access tables except through the ODBC driver interface (and it should be noted that this route does not support many of the special features built into MS Access).
While the product is, on the surface, very easy to use and provides superb reporting features, it is not in itself capable of producing stand-alone applications. The system requires a full copy of MS Access in order to run the deployed application; there is, however, a developer's kit available for purchase that provides a restricted copy of the Access run-time engine which can be distributed free of licence fees.
At this point in its history MS Access suffers from the following two major problems.
From the very first version it has proved extremely demanding on hardware resources: version 1 and version 2 were next to unusable on anything less than a 486x33 PC with 8MB of RAM (Microsoft recommended a minimum of a 386 with 6MB of RAM, but this was always unrealistic). The RAM requirements increased sharply as the size and complexity of the database and/or application increased.
This situation has worsened with the release of the 32bit version 7 for Windows 95. Indeed the release of Microsoft Office Professional for Windows 95 went ahead without MS Access 7 in the box. Instead a voucher for MS Access 7 and a copy of MS Access 2 were included in the box. On the eventual release of version 7 Microsoft had to retract its claims that the system would run in 8MB of RAM and offered a refund of part of the cost of Office Professional to those who had the voucher for MS Access 7. The end result is that minimum realistic requirements for MS Access 7 are now set by Microsoft at 486x66 with 12 MB of RAM. Developers are reporting that for deployment of workable applications a system with a minimum of 24MB of RAM and a P100 Pentium is a more realistic option.
The second problem is more subtle. The ease of use of Access is such that it encourages less skilled developers to tackle large and complex development projects for which the system is not really designed (hardware will catch up eventually, just as hardware improvements have transformed the performance of XBase applications). This leads to promising early prototypes, developed with small amounts of data, which do not reveal the problems that will be experienced when the application is scaled up to production sizes. Indeed, developers have reported serious performance problems even with small systems of fewer than 5,000 records.
The principal problems relate to data manipulation, in other words the speed at which new records can be appended, deleted and updated. For even small to medium-sized databases the append time stretches considerably as the number of records increases, making it a very poor choice for transaction-intensive or on-line applications such as customer order enquiry systems.
We have a simple example of this problem. For a simple one-field update and append of a single unrelated table in a program loop running from 1 to 100,000 (inserting 100,000 records), the insert took 15 minutes on a P166 Pentium-class processor with 48MB of RAM using Access version 2.0. A similar test using the CA-Clipper XBase system required approximately 1 minute 45 seconds, and the same test using FoxPro took 3 minutes.
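The shape of this benchmark is easy to reproduce with modern tools. The sketch below uses Python's sqlite3 module purely as a stand-in engine; the absolute timings are not comparable with the 1996 figures, only the method — append N single-field records in a loop and time it — is the same.

```python
# Modern sketch of the insert benchmark described above: append N
# single-field records in a loop and time the operation. sqlite3 stands in
# for the desktop engines of the period; only the method is comparable.
import sqlite3
import time

def insert_benchmark(n):
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE t (value INTEGER)")
    start = time.perf_counter()
    for i in range(n):
        conn.execute("INSERT INTO t (value) VALUES (?)", (i,))
    conn.commit()
    elapsed = time.perf_counter() - start
    count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
    conn.close()
    return count, elapsed

rows, secs = insert_benchmark(10_000)
print(f"inserted {rows} rows in {secs:.2f}s")
```

As the paper stresses, the important thing is to run such a test at realistic record counts: an engine that looks fast at 1,000 rows may degrade badly at 100,000.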
The combination of heavy resource requirements and poor data manipulation results in very poor scalability of MS Access applications. As such it should be accepted, at this stage in its development, that MS Access is really only suitable for small workgroup applications which are not mission critical and do not involve large transaction volumes or large quantities of data.
It would, however, be fair to say that MS Access is suitable for the feasibility study phase of a development project; it is a useful tool for the creation of prototype applications. Indeed, MS Access is particularly appropriate for the development of client/server prototypes as its support for SQL is very comprehensive. However, one should always remember that the prototype should never go on to form the core of the production application.
Microsoft Visual Basic has become one of the most popular development environments available today.
It is an interpreted P-Code language, meaning in effect that it executes code more slowly than its fully compiled (native-mode) competitors such as C++, Visual Objects and Delphi.
The great popularity of Visual Basic with developers is largely a result of its component-based approach. Visual Basic is designed to make use of plug-in components (VBXs, or Visual Basic controls). These components are normally written by third-party developers in C or C++ and perform tasks that would normally be beyond the competence of the average Visual Basic developer.
The type and range of tasks that these VBX controls perform is enormous. Third-party vendors supply VBXs providing fully featured communications terminals, plug-in spreadsheets, word processors, data grids and much more. However, the use of a great many VBX controls in a single application will incur a major performance hit at run-time. Some developers tend to go overboard and end up with ten or more controls in a single application, some of them large components such as spreadsheets. A VBX is basically a special type of dynamic link library specific to Visual Basic; using a great many of them can result in slow load times and the need for the operating system to swap controls in and out of memory, and once swapping starts to occur it can seriously affect the user's perception of the application's responsiveness.
Visual Basic provides an excellent development environment which is very easy to use and very flexible. It is important to note that this ease of use does not automatically mean that developers new to Visual Basic will correctly structure their applications and use the best programming techniques for Windows development. It is a myth that code-free development can be achieved for anything other than trivial applications (indeed this is true of most available development systems). There is still no substitute for a skilled and competent development team, and the learning curve is still quite steep (it takes roughly six months for a developer skilled in DOS programming to get up to speed with VB).
The popularity of Visual Basic's development style (its general ease of use) has inevitably led to the use of VB for database applications. By default VB makes use of the MS Access 'Jet' database engine, and as a result it suffers from exactly the same performance problems as MS Access itself, making it suitable only for small to medium-sized desktop database applications. VB is, however, very good for developing front ends for client/server databases such as Oracle or MS SQL Server, since in that role the Jet engine's local performance problems do not apply.
There are numerous third-party database libraries available for Visual Basic which avoid the requirement to use Jet for data access. These may provide a better approach when considering Visual Basic for a large application; however, nothing will get round VB's inherently slow code execution.
MS Visual C++ is the professional developer's choice for MS Windows development. Application development with C++ is neither easy nor rapid compared with other tools; however, the resulting applications are the fastest and most efficient you will get.
Visual C++ is not normally seen as a natural choice for database applications as a result of this lack of rapid development; however, its built-in support for database applications is improving year by year, and it is the only native-mode compiler with direct support for the MS Access Jet 3 database engine.
The skill level required for successful application development is very high, and it is usually difficult and expensive to retain C++ developers.
Rapid Application Development Systems
Borland Delphi is one of a new breed of RAD (rapid application development) systems appearing on the market today. It is aimed at general-purpose application development tasks, with the recognition that many programming tasks now require database support.
Delphi has a fast native-mode compiler and its own component approach, the VCL (Visual Component Library). Delphi contains a very broad range of user interface tools and an extremely good development environment; the resulting applications are fast and reasonably small. Its database support is quite broad, and it has a strong third-party market providing a wide range of VCL controls and additional database support.
The performance of Delphi applications is excellent and development is relatively easy. The underlying language is a dialect of Pascal which was the main teaching language in most UK universities and colleges which means that most developers will have some familiarity with it.
It is hard to think of many drawbacks. Some Delphi developers report that it is not an ideal language for expressing the business logic inherent in database applications, and others have reported some limitations in the data manipulation features; however, the majority report that they are very happy with Delphi and are achieving good results.
CA Visual Objects (VO) is another of the new breed of RAD systems; it is a direct descendant of the CA-Clipper language and is in effect a superset of CA-Clipper and XBase.
VO is a dedicated object-oriented database programming system with an incredibly rich programming language. It is not really suitable for the development of very small utility programs, but is a superb development environment for small, medium and large database applications.
VO, like C++, is the professional developer's choice. It is not an easy language and it lacks many of the user interface design tools found in more popular products such as VB and Delphi; however, it more than makes up for this by having the most versatile data manipulation language available in any environment today.
VO is indeed best compared with C++: it provides a better development environment and an easier and more powerful language. The resulting applications are as fast as anything C++ can produce and are more easily maintained and extended.
It has superb connectivity allowing manipulation of data in almost all desktop database formats and provides ODBC connectivity to a wide range of mainframe and client/server databases.
VO's steep learning curve means that it is not really suitable for the small in-house development team unless the team has a lot of CA-Clipper experience already and wants to move to the ideal Windows environment for the Clipper developer.
Client/server database solutions are not cheap; they are also not easy, and represent a major challenge to the average development team. The main advantages are scalability and data handling capacity, greater security of data and a reduction in network traffic.
Client/server application development requires considerable planning and very careful management. Skills will normally have to be hired in to support the project, and the on-going maintenance costs are high: a full-time database administrator (DBA) will normally be required to tune and maintain the server database, and the host server will normally be running one of the many flavours of UNIX, thus requiring a skilled UNIX administrator as well.
Many client/server databases have their own proprietary development languages which are very unlike their desktop cousins. Data manipulation is universally through SQL, which provides very powerful features, although developers often have to resort to procedural code embedded in the client-end application. Many of the modern desktop databases can connect to and manipulate the data of these client/server systems.
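The contrast between record-by-record procedural code and set-based SQL manipulation can be illustrated as follows, using Python's sqlite3 module simply as a convenient SQL engine for the sketch.

```python
# Procedural (record-by-record) versus set-based SQL data manipulation.
# sqlite3 is used purely as a convenient SQL engine for this sketch; a
# client/server database would execute the set-based form on the server.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, "open"), (2, "open"), (3, "shipped")])

# Desktop/procedural style: walk the table and update one row at a time.
open_ids = conn.execute(
    "SELECT id FROM orders WHERE status = 'open'").fetchall()
for (oid,) in open_ids:
    conn.execute("UPDATE orders SET status = 'closed' WHERE id = ?", (oid,))

# Set-based style: one statement applies to the whole set at once, which
# is what keeps network traffic low in a true client/server configuration.
conn.execute("UPDATE orders SET status = 'archived' WHERE status = 'closed'")

archived = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE status = 'archived'").fetchone()[0]
print(archived)  # -> 2
```

In a real client/server deployment the set-based statement travels to the server as a single request, whereas the procedural loop generates one round trip per record — one reason the paper stresses reduced network traffic as a client/server advantage.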
While rapid development of robust applications has long been the goal of client/server system vendors, it is often not achieved, and there has been a long history of failed client/server projects in which developers struggled with the new level of skill required and the new issues raised by client/server development.
In summary, while client/server is an elegant and effective solution, it is definitely still in the future for anything other than very large systems, such as those run by the major banks, where the cost can be justified.
The following tables provide both subjective and objective metrics comparing the various tools discussed in this document. The first table provides what must necessarily be a subjective comparison of the difficulty that would be experienced by a developer in attaining competence in each of the development tools detailed in this report.
| Development Tool | Execution Type | Comparative Difficulty (0-100) | Months to achieve | |
|---|---|---|---|---|
| MS Visual Basic | Interpretative | 60 | 6 | 18 |
| MS Visual C++ | Native-Code | 100 | 9 | 36 |
The second table shows typical application speed ratings for the different types of execution type. This information was presented in the US at an industry Technical Conference.
|Execution Type||Application Speed|
The next table shows the results of a simple test which highlights the underlying basic performance of each database engine. The test consists of inserting a number of single field records. As C++ does not have a specific database engine, it cannot be tested comparatively.
It should be remembered that database engine tests do NOT reflect the speed at which the application runs, which is also dependent on how much processing is done, and the table above is indicative of speeds for this. Within each group, some tools will go faster under certain conditions and some slower.
The database engines that were tested are the "default" ones. As all of the tools can support more than one database engine, the speeds are not necessarily the fastest that can be achieved. In particular Clipper is not generally used with its default, but with the same engine as used by FoxPro.
The FoxPro test was conducted with the on-screen progress reports disabled, as these are known to slow the system down substantially.
| Development Tool | | 20,000 records | 100,000 records |
|---|---|---|---|
| MS Visual Basic | 0:48 | 3:06 | 16:40 |
The speed of database engines can vary enormously depending on which tool is utilising it, and the following table gives some idea of this. There are a number of things that have to be considered when choosing a database engine, including:
- Inserting records
- Scanning through the database
- Seeking a record using an index mechanism
- Updating the record, i.e. replacing the information.
The range of times for each tool over a range of database engines varies by a factor ranging from five to five hundred. However, in all cases the DBFCDX driver was either fastest or second fastest. This underlines the importance of choosing the right tool for the data environment and evaluating it with REALISTIC data quantities.
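An engine evaluation covering the four operations listed above — insert, scan, indexed seek and update — can be sketched as follows, again with Python's sqlite3 standing in for the engines of the period. The point is the method of timing each operation at a realistic data volume, not the absolute numbers.

```python
# Sketch of a four-operation engine evaluation (insert, scan, indexed seek,
# update), with sqlite3 as a stand-in engine. Timings are illustrative only.
import sqlite3
import time

def time_op(label, fn):
    """Run fn, print how long it took, and return its result."""
    start = time.perf_counter()
    result = fn()
    print(f"{label}: {time.perf_counter() - start:.4f}s")
    return result

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, payload TEXT)")

# 1. Inserting records
time_op("insert", lambda: conn.executemany(
    "INSERT INTO t VALUES (?, ?)", ((i, "x") for i in range(20_000))))
conn.execute("CREATE INDEX idx_id ON t (id)")

# 2. Scanning through the database
rows = time_op("scan", lambda: conn.execute(
    "SELECT COUNT(*) FROM t").fetchone()[0])

# 3. Seeking a record using an index mechanism
hit = time_op("seek", lambda: conn.execute(
    "SELECT payload FROM t WHERE id = ?", (19_999,)).fetchone())

# 4. Updating the record, i.e. replacing the information
time_op("update", lambda: conn.execute(
    "UPDATE t SET payload = 'y' WHERE id = ?", (0,)))
```

Repeating such a harness at 20,000 and 100,000 records for each candidate engine is exactly the kind of evaluation with realistic data quantities that the paragraph above recommends.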
The last table shows the "competence" of the language for creating applications, where processing of data has to be carried out:
| Development Tool | Language | Flexibility | Form Design | Report Writer | OOPS | |
|---|---|---|---|---|---|---|
| MS Visual Basic | Fair | Good | Excellent | Fair | Pseudo | Limited |
| MS Visual C++ | Excellent | Excellent | Fair | Fair | | |
Finally two graphs are shown to indicate the performance differences between Microsoft's Access product and Computer Associates Visual Objects in terms of complexity of application, and the number of records being handled.
It should be stressed that these graphs are in part based on the subjective assessments given in the above tables. What they do not indicate is the amount of time that would be expended on development.
Microsoft's Access product is extremely easy to use for creating simple applications which need to handle fewer than 1,000 records. The speed at which such a simple application will operate is also satisfactory.
Visual Objects has a steep learning curve, but then becomes an extremely fast and powerful development platform because it is object orientated and has excellent inheritance features. Degradation due to complexity of application and number of records is barely noticeable.
For development of business-critical applications it is recommended that a modern RAD tool which generates native code, such as CA-Visual Objects, is used rather than an interpretative SQL system such as Microsoft's Access.
UPDATE: July 1998
Since December 1996, all the products reviewed have been upgraded. A further review of the latest releases indicates that the relative findings of the report remain valid. It is possible that Delphi 4 (released July 1998) will be as good as CA-VO2 Release 2.5 (still in beta, due for release December 1998). Microsoft's Access and Visual Basic have improved enormously, but are still directed at the smaller end of the in-house software market.
UPDATE: October 1999
TDOC Projects Ltd have now acquired operational experience of products which they have developed in Delphi 4. The main deficiencies are in the speed of data management. The Borland "BDE" is no match for the CA "RDD" technology. CA-VO2.5 has been recently released and promises to be considerably superior to Delphi 5. TDOC Projects Ltd have therefore decided to continue product development in CAVO2.5 using the Advantage Database Server.
UPDATE: June 2006
TDOC has now moved to VO2.7 Build 2740, the fastest development environment imaginable, and probably the most mature. The extent of the development libraries available is unmatched for the WinTel platform; these include TabControlPlus, ReportPro, bBrowser and SO's Internet library, to name a few.
The Advantage Client Server Engine on which TDOC is based outperforms all other similar engines. In seven years of use not one single support call has been received from users.
UPDATE: November 2007
TDOC has now moved to VO2.8 Build 2822. It is expected that TDOC will next move to the "DOT NET" environment using the Vulcan Development Language, which in provisional tests has indicated its superiority over other development languages within the same environment.
TDOC was nominated for both Product of the Year and Document & Content Management Software of the Year in the 2007 Construction Industry IT Awards, held in November 2007.
UPDATE: November 2009
TDOC has now moved to VO2.8 Build 2830. Plans to move TDOC to "DOT NET" are in place. Although Vulcan is virtually ready, some vital components remain to be ported. However, there is no user demand for the move. TDOC now runs "over the internet" directly. As processing is split between the client/server engine and the user, it is considerably faster than competitive products using a web browser interface.
Many of the languages which have been compared are available in the DOT NET environment. However, the comparisons of speed of operation and the reliability of development remain valid.