Major PC Problems Expected In Y2k : LUSENET : TimeBomb 2000 (Y2000) : One Thread

The following presentation was made to the Royal Society in England on October 5, 1999. The author, Bruce Parker, has predicted major PC and desktop shutdowns after Y2k. To download a FREE AUDIT to see if you have short date issues on your PC that will cause you problems, you can go to: or

Most of the attention for Y2k problems has focused on "mission critical system" mainframes and embedded systems failures. Would non-functioning PCs contribute to the domino effect? Here's a Time Bomb 2000 string with additional information:

Insert> The Royal Society October 5th 1999

Computer Heresy - Is Source Code Necessary?

A Presentation by MFX Research Pty Ltd

Given by Mr Bruce Parker - Director of Research & Development & Dr Michael Carr - Managing Director


This document and the MFX software described within it are copyrighted with all rights reserved. Under copyright laws, this document may not be copied, photocopied, reproduced, translated or reduced to any electronic medium or machine readable form, in whole or in part, without the prior written consent of MFX Research Pty Ltd. (MFX). Failure to comply with this condition may result in prosecution.


The information contained in this document is intended to be used as a guide to some of the more complex Year 2000 (Y2K) issues. In no event will MFX be liable for direct, indirect, special, incidental or consequential damages arising out of the use, or inability to use, the information contained herein. MFX reserves the right to correct, update or modify this document without notice and without incurring liability.


Throughout this document, references are made to other manufacturers' proprietary products. All Trademarks are acknowledged.

Copyright © 1999 MFX Research Pty Ltd.

MFX Research Pty Ltd. Level 1 19 - 23 Bridge Street Pymble NSW 2073 Australia

Telephone: +612 9440 0200 Fax: +612 9440 0033 EMAIL: WEBSITE:


Dr. Michael Carr - Managing Director - M.B., B.S., F.R.A.C.R., F.A.N.Z.C.A.

Michael Carr is a founding director of MFX Research and was appointed Managing Director in March 1999.

Michael is a Medical Specialist who has administrative expertise and was the managing partner of a large medical business for the 10 years prior to his involvement in MFX Research. His interest in computer technology resulted in his meeting and working with Bruce Parker on a telemetry program in 1997.

Michael and Bruce co-founded MFX Research in September 1997.

Mr. Bruce Parker - Director of Research & Development.

Bruce studied Computer Science at Sydney University. In 1989 he developed a litigation computer support program and implemented the use of computers for hearings within the Supreme Court of Australia. In 1994, he developed the Romulus Software program, which is used in the management of litigation matters in Australia.

In 1995, Bruce won the Australian Business Software Award for the development of RamGate, a Windows stabilisation program, which is now sold by IMSI, ForeFront and RMG. In 1996 he developed the back end for Wang's Open Workflow suite.

Bruce co-founded MFX Research with Michael Carr and won the 1997 Australian Business Software of the Year Award for developing MFX 2000.


1 Abstract

MFX Research has developed technology, called the DaM (Detect and Modify) algorithms, that represents a significant scientific breakthrough.

While it has previously been accepted that computer binary code could be manually changed, this was confined to a hit-and-miss approach targeting single files.

MFX Research has developed a fully automated process that allows any file or group of files to be entered and modified, ensuring that:

- File functionality is preserved
- Internal calculations are correct
- The same checksum is maintained

The significance and applications of the technology will be discussed with special reference to its use in the evaluation and correction of Y2K issues.

There will be demonstrations of the DaM technology and of its use in software tools to diagnose and fix Y2K errors in computer operating systems, applications and data files.

2 It has always been possible to modify files without source code

As those of you involved in the computer industry will know, it has been, and of course still is, possible to modify compiled computer files without the need to revert to source code.

This may be done using simple binary or hexadecimal editors to change a given function or step within a compiled file.

The definition of a compiled file, for the purposes of today's presentation, is a file that has been designed to provide a set output from information entered by a user. To achieve this functionality, a design or functionality specification is initially required, from which a computer programmer will create, in a text format, a set of instructions detailing the interface and the actual calculations to be performed.

This text format will vary depending on the development language being used. Once the source code is completed, it is translated into a structure that the computer on which it is being implemented can understand. In other words, it is compiled. Whilst the original source code can be read by someone expert in understanding it, once it is compiled it takes on a totally different format.

Once a file has been compiled, the identification of specific functions becomes extremely difficult, due to the inter-relationships within the file itself. To ensure the integrity of a file, a function known as a checksum is included. In earlier operating systems this checksum was the total of the hexadecimal values of each of the characters within the file. Today, it is more common to find that the checksum is just a simple total of the number of bytes within a file.
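The two integrity measures described above can be sketched in a few lines of Python. This is an illustrative sketch only, not MFX's software; the sample string is the example used later in this presentation.

```python
# Two simple integrity measures: an older-style checksum summing the
# value of every byte, and the cruder measure of counting the bytes.
def value_checksum(data: bytes) -> int:
    """Total of the hexadecimal values of each character in the file."""
    return sum(data)

def size_checksum(data: bytes) -> int:
    """Simple total of the number of bytes within the file."""
    return len(data)

sample = b"My word processor"
print(hex(value_checksum(sample)))  # 0x6a2
print(size_checksum(sample))        # 17
```

Note that a modification can change the byte values without changing the byte count, so the second measure is much easier to defeat.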

This functionality can best be seen with virus scanners. At present there are two major types of virus scanner: those that look for a virus signature, and those known as heuristic, which look for any change in an executable file's size and/or the last date/time the file was modified. These scanners tend to perform full hexadecimal value counts only on their own files, because of the amount of time it takes to scan a file.

Various tools are currently available to allow you to view and edit compiled or object code, they are more commonly known as Hex Editors. I use Xtree, Ztree and HexEdit. These tools not only allow you to see the hexadecimal code of a compiled file, but they also allow you to edit and save any modifications.

With tools of this sort it is possible, within limited guidelines, to change the functionality of a file. The ability to do this, and the guidelines, are unfortunately not made known to most new programming graduates these days, as most of their time is spent learning the various compilers, networks, infrastructures etc. of the modern computer environment. Indeed, I have lost count of the number of IT professionals under the age of 35 who tell me that you cannot change an executable file, as it will be corrupted.

3 The rules are simple

The rules are simple. Make sure that:

 You keep within individual procedures, and that the procedures are still functional.

 You preserve the file date / time stamp

 The size of the file remains exactly the same.

Given these parameters and the average sizes of today's compiled files, there is plenty of room to modify files.

Most modifications to files at this level occur to correct an annoying typing mistake on a button or within a message box, to remove a splash screen, or for what is more commonly known as hacking.

Changing the characteristics of a file is fairly simple, as the file's name and the message boxes are in text format and at the hexadecimal level are represented by their relevant character codes. In effect, you can switch to a binary view of a file and actually see the text displayed. For example, if a file is compiled to operate on Windows '98 and is to fulfil the function of a basic word processor, we can look for "My word processor", the name displayed by the parent window of the file, at the hex/binary level and manually change this to "PC Word document". However, three things are required to ensure that the file will still operate.

With such a substitution, 17 characters have been replaced with 16. This means that not only is the byte size of the file 1 less, but all procedural addresses within the file have moved by 1 byte, which is enough to make the file inoperative. The easiest way to correct this is to add a space at the end of the new title so that the original and the new title are both 17 characters long. This addresses the file size and the byte size/checksum.

Next the file needs to be re-stamped with its original last modified date time stamp. Again, this can be done manually using the tools I have previously mentioned.

Thirdly, to be absolutely sure, the total value of the hexadecimal characters must be maintained. "My word processor" totals 6A2 and "PC Word document" totals 5CE, which means that we have a deficit of D4. The easiest way to correct this is to add D4 to byte 770. If instead there were a surplus of D4, a quick scan of the file for further text structures would be performed, and the required number of lower-case characters changed to upper case; each such change reduces the total by hex 20. In this case 7 characters would need to be changed to upper case and 0C would be inserted at byte 770.

The above 3 steps guarantee, at this simplest level, that the file will not suffer from any form of error, and there is no way of detecting the modification to the file other than the visual change to the file's parent window.
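The padding and checksum arithmetic of the steps above can be verified with a short Python sketch. This is illustrative only; byte 770 is simply the author's example slack location, and the final line notes that the padding space itself accounts for hex 20 of the deficit.

```python
# Verify the substitution arithmetic described above.
old = b"My word processor"   # title displayed by the parent window
new = b"PC Word document"    # intended replacement (one byte shorter)

# Step 1: pad with a trailing space so the byte length is unchanged.
padded = new.ljust(len(old), b" ")
assert len(padded) == len(old)

# Step 3: the value checksum must also be preserved.
print(hex(sum(old)))             # 0x6a2
print(hex(sum(new)))             # 0x5ce
print(hex(sum(old) - sum(new)))  # 0xd4 deficit, as in the text

# The padding space contributes hex 20, so after padding the amount
# still to be added back elsewhere in the file is 0xd4 - 0x20 = 0xb4.
print(hex(sum(old) - sum(padded)))  # 0xb4
```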

If an understanding of a specific file exists, then more than simple text may be changed. In the hacking world, the bypassing of software locking mechanisms, splash screens etc. may be achieved, usually by trial and error rather than by any set method.

Whilst modifications are usually made at the hex/binary level, it is also possible to make changes at the Assembly level; but at that level a trail is left, in that there will be a change in the checksum of the file from its original compiled value. It is much easier to ensure the checksum integrity of a file at the hex level.

Having established that it is possible to modify a compiled file manually, we need to explore where this may lead.

In today's environments, with operating systems like DOS, Windows, Linux etc., applications are created from a number of files, with the main executable file being the centralised hub for the functionality delivered by each of the associated files. This means that manual modification of a single file will have ramifications across some or all of the associated files.

Returning to the simple Windows example, suppose we introduce a spreadsheet application named "My Spreadsheet Application" and embed a spreadsheet from this application into "My word processor". If the application's name is changed to "My Spreadsheet Tool" then, whilst the application itself may operate correctly, the word processor has an embedded file which was supplied by "My Spreadsheet Application". For this embedding to occur, a handle is required, which will usually be the application's name together with the internal name of the application (this internal name is different from the Windows name). With the variance introduced, the word processor will not be able to display the spreadsheet, as it can no longer find all of the required parameters.

This means that what is a simple modification in a single file, now requires modifications to both applications. This may be restricted to just the two compiled files, but more likely other external files will become involved and will also require modification.

4 Modern Operating Systems & complex interconnections

Due to the interaction between files and the various functions supplied by library files for spell checking, charting and other functional controls, we reach a point where a single application executable may be supported by 50 to 100 individual Dynamic Link Library (DLL) files. With the advent of object-based software development, and the use of files to supply controls like tabs and rotating buttons as well as additional functionality for the internet, we have now moved to the point where dozens of individual files are involved in the operation of a single application.

Also, with modern software, it is possible to license functionality. For example, if there is a need to display results graphically, rather than building an application it is easier to license a product like Crystal Reports. This application has numerous supporting files and requires the introduction of Object Linking and Embedding. The data used for charting needs to be held in a database structure, so we may elect to use a Jet database to store the data. This means that we need to use Data Access Objects and Open Database Connectivity. Each of these additional functions increases the interaction between multiple files.

5 API Calls

In the Windows environment, all of these files will be making Application Programming Interface (API) calls to return standard functionality, be it the borders around an application, the positioning or sizing of an image, or the keyboard characters being input etc.

6 Hundreds of files may be referenced

So an application which may have started life as a stand-alone COM file in DOS may now be a pivotal executable file in Windows '98, with hundreds of other files being referenced and supplying information or functionality. There is a further complication in that the majority of these support files will also be used by other applications.

An example of this would be the world's most popular office suite of products, Microsoft Office. If we step back 5 years, Microsoft Office Pro version 4.3 included Word, Access, PowerPoint, Excel and some additional functionality. The number of files required to operate this version of Office was in the low hundreds. With Office 2000 Premium, we have the latest versions of the same four base applications as well as numerous other tools. However, the number of files has jumped to thousands, with common files being accessed by all of the major applications, including menu controls, open and save functions (API), grammar, spelling, charting, embedding, accessing databases etc.

So what is the significance of modifying a single file within this sort of environment?

Manually changing a function in a single compiled file is potentially going to have ramifications across all other applications related to that specific file. Let's say we find a simple formatting structure in a Visual Basic application and, at the binary level, change "#,##0.00" to "#,##0". In other words, we have removed the cents component, and we have complied with the rules set out for manually modifying a file. If the application is designed simply to total figures within itself, the only error introduced is the inaccuracy due to the dropping of the cents: a simple cumulative summation error will occur. If this same application is linked to a database, then corruption may be introduced into the data held within the table. If there is error trapping within the normal form views of the database that requires the cents to be retained and displayed, an error will be induced which, as far as the VB application is concerned, never happened. In other words, due to the multiple ways used to access data, changing one access method will not only potentially corrupt the data and force incorrect calculations, but errors may be created in seemingly unrelated applications.
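The cumulative summation error described above is easy to demonstrate. The list of amounts below is hypothetical, and truncation stands in for the dropped cents; this is an illustration of the effect, not of any particular application.

```python
# Demonstrate the cumulative error introduced when a format change
# drops the cents component before figures are totalled.
amounts = [10.25, 3.99, 7.50, 1.49]  # hypothetical figures

exact = round(sum(amounts), 2)            # cents retained
truncated = sum(int(a) for a in amounts)  # cents dropped per figure

print(exact)      # 23.23
print(truncated)  # 21
```

The error grows with every figure summed, which is why the author calls it cumulative.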

7 Manual file modification is high risk

This means that manual modification to a file in today's Windows environment carries enormous risks. This is one of the reasons why modifying object code is becoming a dying art.

If there is an error needing correction at the object code level because source code is not available, we need a way to make the change safely. At the base level, we have to understand the environment in which an application is operating. Is it a standalone file? Does it have dependencies only in its own directory, or does it have system dependencies as well? We also need to understand the function that has to be changed. Will it impact anywhere else? Is it only used internally in the file? What are the downstream effects of making a change?

8 A function must be globally tracked

We have now reached a point where we have to be certain that whatever we do is followed all the way through its path. Thankfully, all compilers use their own standard structures, the interfaces between files compiled with different compilers can interpret the information or call being made (these are invariably string-based in one format or another), and computers understand specific binary commands, so it is possible to track a function globally through individual files. Done manually, this would take a very long time, and if a call is picked up by a file rather than being sent, the likelihood of missing it is extremely high.

9 MFX search engine

To that end, MFX Research has developed a tool which, at its most elemental level, is a simple search engine that will identify specific structures, be they binary, text or hexadecimal. This tool allows specific structures to be scanned for and located. For example, I previously discussed changing the name of an application and the impact this may have on other files. If we take a simple example using Microsoft Office '95 and look for "Microsoft Word" only through the Word, Access and Excel sub-directories, one would expect to find this text only in the Word directory, as this is what appears in the Word title. What we actually find is:

 55 files with 307 occurrences of "Microsoft Word", the majority of which are in template files.

 We also find, as expected, occurrences in winword.exe. What we do not expect are the occurrences in wwintl32.dll, wdvisex.exe, macrode.exe, msaccess.exe, xlvisex.exe and xlint32.dll. These results are from scanning only the 3 relevant sub-directories. If a full PC were scanned there would be many more.
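At its most elemental level, the kind of search just described can be sketched as a recursive scan counting occurrences of a byte string per file. This is a minimal sketch, not the MFX tool; the directory path in the usage comment is hypothetical.

```python
import os

def count_occurrences(root: str, needle: bytes) -> dict:
    """Count occurrences of a byte string in every file under root."""
    hits = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    data = f.read()
            except OSError:
                continue  # unreadable file: skip it
            n = data.count(needle)
            if n:
                hits[path] = n
    return hits

# e.g. count_occurrences(r"C:\MSOffice", b"Microsoft Word")
```

As the text notes, a scan like this misses encrypted and Unicode forms of the string; a Unicode-aware version would also need to search for `"Microsoft Word".encode("utf-16-le")`.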

So if we were simply to change "Microsoft Word" to "ABC Word" within the winword.exe file, then multiple errors are possible. As we are only looking for a simple text string, and no allowance has been made for encryption or for a Unicode version of the string, the 307 occurrences identified may not be all of the references used. A further 20 or 30 structures may be missed.

10 The tool's main functions

The tool also carries a modify function. When substitutions are made, spaces are automatically inserted as required, the hexadecimal count is automatically calculated, the relevant inserts are made at the relevant byte locations, and the file is re-stamped with its original date/time stamp.

11 An iteration of the MFX Research Detect & Modify algorithms

The tool has been named the Find and Replace Tool, and it is an iteration of the Detect and Modify algorithms developed by MFX Research. It can be used for determining all of the various structures used by compilers, and any common references or usages across a wide number of files.

In the example with Microsoft Word, the tool can identify and replace all instances of "Microsoft Word" with "ABC Word" across a whole system in a matter of seconds.

The tool can simply achieve a find and replace across the wwintl32.dll file. We look for "Do you want to save changes to " and replace it with "Are you sure you want to delete". This means that the next time a user closes Word without saving the document they have been working on, rather than being asked "Do you want to save changes to Document1", they will be asked "Are you sure you want to deleteDocument1". In other words, by changing the text in the message box, the document will be lost.

12 It is possible to change the functionality of an .exe file

In addition to this it is very easy to identify calls to DLLs at object code level. It is possible to switch the calls made so that incorrect functionality is returned.

The Find and Replace tool includes a graphical interface with buttons and list boxes. It is possible to target a specific file with modifications as outlined above and leave no trace, as all the files external characteristics will remain unchanged. This is a very simple use of the tool, but it introduces a whole new way for viruses to attack a system, and there is no tool available to detect any changes.

13 Changing security settings

This tool can be used for changing security settings. I will look at two different and relatively simple levels of security.

I do not want to appear to be targeting Microsoft, but with their market dominance at the PC level, most people are familiar with their products and it makes giving an example much easier.

Consider a Word document, an Excel spreadsheet or an Access database. In all three cases we can protect the file by assigning a password, which means that to open the file and view the data, we must start the application and enter the password. We have no such protection at the object code level. To gain entry to a password-protected Access database, all that is required is to create a database, enter some data and save it. The next step is to assign a password and save the database under a different file name. A simple comparison is then made between the two. As the only difference between the files is the inclusion of the password function, the differences at the object code level will relate only to that function. By comparing the two and using a substitution engine, it is possible to change the password on the database and then gain access to it.
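The comparison step just described amounts to a byte-level diff of the two saved files. A minimal sketch, under the assumption that both versions are the same size (the byte contents below are invented for illustration):

```python
def byte_diff(a: bytes, b: bytes) -> list:
    """Return the offsets at which two equal-length byte strings differ."""
    if len(a) != len(b):
        raise ValueError("files differ in size; offsets are ambiguous")
    return [i for i, (x, y) in enumerate(zip(a, b)) if x != y]

# Comparing the no-password and password-protected copies isolates
# the bytes belonging to the password function (hypothetical data).
without_pw = b"\x00\x10\x20\x30\x40"
with_pw    = b"\x00\x10\x25\x30\x41"
print(byte_diff(without_pw, with_pw))  # [2, 4]
```

The offsets reported are then the candidate locations for a substitution engine to target.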

Secondly, using the same basic theory but with a global Find and Replace tool, the same can be said for any operating system's security system. If a user has standard user privileges and has access to the system files, then using similar methodologies not only passwords but profiles may be changed, and again there is no way of identifying when the changes occurred, or of the changes being picked up automatically.

The Find and Replace tool is designed to only operate on closed files, but it is just as easy to operate on files loaded into memory.

All of the previous examples demonstrate simple substitution and all open up a whole range of potential security issues. Making files read only does not work, because it is a simple function to automatically change file attributes prior to modifying them and setting them back once the modification has occurred. This is done in a similar way to the way in which the last date/time access stamp was changed by the tool.

Is there a solution to these issues?

14 Y2K - the millennium issue

The answer is yes. It can be done through the use of simple indexing of the number of characters and their addresses within a file.

The Year 2000 issue is on everyone's mind. In early 1997, when we were developing the methodology to allow global modification of functions, a use for the technology was needed. An ideal application was in the Y2K arena, correcting the usage of short dates. To apply the Detect and Modify algorithms, all possible forms of short dates needed to be identified. The implications of changing a date structure had to be carefully identified and fully tested. Additionally, we had to ensure that date calculations were correct and that we modified only true date structures.

15 Over 43 thousand possible short date formats

Over the ensuing 10 months, all possible date structures were created and then compiled using over 100 different compilers. Each structure was placed in all possible functional scenarios so that its hexadecimal representation could be identified. It was then modified using the Find and Replace tool to set up the ground rules for what could and could not be modified. This was initially undertaken only in the PC environment. Some very simple observations were made: there are only 43,187 possible short date formats at object code level, regardless of the functionality of the date; only 24 of these structures were not referenced as date structures; and all of the 43,163 structures that were potentially referenced as date structures had a substitutable long date format.

For example, the two most commonly used date formats at text level are "dd/mm/yy" and "mm/dd/yy". If there are available bytes within the procedure, like 20 (space) or 00 (null), then these may be used and the structures become "dd/mm/yyyy" or "mm/dd/yyyy". If there is no available space, the structures become "d/m/yyyy" or "m/d/yyyy". The loss of a "d" or "m" makes no difference to the passing of date information; the difference is one of display only, namely whether a leading 0 is shown.
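The widening rule just described (consume adjacent slack bytes if available, otherwise drop a leading digit so the overall length is unchanged) can be sketched for these two formats. This is an illustration of the rule as stated in the text, not the MFX algorithm itself.

```python
def widen(fmt: bytes, slack: int) -> bytes:
    """Expand an 8-byte short date format to a four-digit year.

    fmt   -- b"dd/mm/yy" or b"mm/dd/yy"
    slack -- spare bytes (0x20 space or 0x00 null) following the format
    """
    assert fmt in (b"dd/mm/yy", b"mm/dd/yy")
    if slack >= 2:
        return fmt[:6] + b"yyyy"  # dd/mm/yyyy: grows by two bytes
    # No room: drop the leading day/month digit, keeping 8 bytes.
    return fmt[0:1] + b"/" + fmt[3:4] + b"/yyyy"

print(widen(b"dd/mm/yy", 2))  # b'dd/mm/yyyy'
print(widen(b"dd/mm/yy", 0))  # b'd/m/yyyy'
```

Either result passes the same date information; only the display of a leading zero differs.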

This library of known short date structures, with the relevant long date structures, was added to the base Detect and Modify algorithms, allowing full detection and replacement of all short dates used as part of procedural calls, functions, formats or definitions in the PC environment. Over the ensuing months this was developed to the point where the library was, for the most part, restructured as an algorithm and re-incorporated into the main algorithm.

16 MFX 2000 is an iteration of DaM

So, in essence, the product known as MFX 2000 has taken a simple methodology for changing part of a file through to an automated tool that identifies all possible structures referenced by innumerable files and applications, to deliver a global change in a single function: the short date.

The same methodology can be applied to any other function within a given environment. It is simply a matter of identifying all possible structures and ensuring that all substitutions will operate correctly.

17 Arguments against global modification

Arguments against global modification include:

 There is no guarantee of file integrity

 All calculations will be incorrect

 Interaction between applications will be disabled

 Data will be corrupted

18 More Arguments..

 New undocumented errors will be introduced

 The system will become unstable

 It is not possible to do this.

 My source code does not match my object code

 And many more

In answer to these:

19 File Integrity

There is no guarantee of file integrity - The integrity of the file is guaranteed, as it has had only one function changed, which is still of the same functional sort, i.e. it is still a date function.

20 Calculations

All calculations will be incorrect - In the case of dates the converse is true. This is what Y2K is all about. In the difference between 01/01/99 and 01/01/00, humans may assume that 99 means 1999 and 00 means 2000. However, a computer, which bases dates on its own system date (unless windowing is used), sees 99 as either 1999 or 2099 and 00 as either 1900 or 2000. In either case, with both years expanded into the same century, the return is -99. What humans assume and what computers logically return are two different things. By changing the calculation to use 01/01/1999 and 01/01/2000, the answer is 1.
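The arithmetic above can be checked directly. This is a toy calculation of year differences, not how any particular system implements dates:

```python
# A computer that expands both two-digit years into the same century
# returns -99 for the difference, whichever century it picks;
# four-digit years give the answer a human expects.
print((1900 + 0) - (1900 + 99))  # -99  (both taken in the 1900s)
print((2000 + 0) - (2000 + 99))  # -99  (both taken in the 2000s)
print(2000 - 1999)               # 1    (four-digit years)
```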

21 Functionality

Interaction between applications will be disabled - As long as a consistent substitution is made across all files there will be no disabling of files or applications.

22 Data

Data will be corrupted - The technology is aimed at compiled files, not data files. So whilst a table structure may be changed, the data is not touched in any way by this tool set. How the data is managed after the modification occurs is dictated by the functionality of the application supporting the table.

23 Error Introduction

New undocumented errors will be introduced - Possibly, but again, in the case in point, date functionality is being replaced with date functionality, so it is hard to see where errors would originate. I would expect a drop in errors, as the date functionality is now being handled correctly.

24 System Stability

The system will become unstable - There is no reason for this to be the case. If the modifications made have been fully tested and more importantly understood by the developer, the system will not be unstable. In fact, the modifications may improve the stability.

25 It's impossible

It is not possible to do this - A standard reaction usually based on a lack of understanding of the basic principles being applied. NSTL, NATO, BSI and CTSC have certified this technology in the form of MFX 2000.

26 Source code / Object code

My source code does not match my object code - Ideally source code should match object code, but there are many instances where full versions of the source code have been lost, or there is not enough time to make the required modification, or there are not enough resources available to carry out the modifications to the source code.

27 In summary

MFX Research has achieved the ability to modify globally one or many functions of multiple interactive files to correct or change specific functions.

Conventional wisdom states that this is not possible without source code.

We have proven that this is not the case.

-- Brian Bretzke (, November 24, 1999


thx Brian

-- tech time (chew@this.carefully), November 24, 1999.

Brian, not being a programmer, I am having difficulty understanding which files become corrupted due to the short date problem. After the computer is shut off, what would be corrupted (data files, program files, DLL's, etc)? One of the papers mentions errors that build up over time, but where do they appear on the hard drive?

-- Dave (, November 24, 1999.

Dave - I'm no programmer either. BSI, which has set the standards for Y2k compliance in both the US and England, uses this stuff to fix their PCs. Can't help on the technical question.

-- Brian Bretzke (, November 24, 1999.
