The BankOfMontreal Mechanisation system is a home-grown banking system designed to support the on-line banking operations of the Bank of Montreal. It provides real-time transaction processing, as distinct from memo-post, which is still the mode of the other banks of the Canadian "Big Five", all of whose systems are customized versions of the COLT (Canadian On-Line Teller) package.
This page is the current result of an initial collaboration effort by PaulMorrison and HansWobbe. Others are welcome to participate, either by adding information or by posing questions that may help draw out points of interest to members of this community.
- Aspects of this centralized approach to posting that might be of interest in this community include:
- Five concurrent business days - the then 'amazing' ability to span time zones and even the International Date Line.
- The likely 'first' of having a 'Year2000' production problem as early as 1985.
- Huh? BTW I tried to persuade them that we ought to fix the problem we are going to have in 2100, but they didn't expect the system to be around that long...
- One of the application systems calculated the interest rate payable over the term of the instrument by subtracting the current date from the maturity date. A 15-year instrument processed by that system in the summer of 1985 caused an abend, since (20)00 was less than (19)85 (see the date-arithmetic sketch after this list).
- 'Pre Date cut off' - allowing yesterday's keying errors to be fixed up until 10:00 am. A great cost saving idea.
- Month end trigger processing - recalculating the month end statement balances for accounts that changed after the statement was produced, but before it was printed.
- Would it add to the interest of this page if we posted some of the more remarkable differences between the American and Canadian banking industries that 'constrained' the development and evolution of banking systems in their respective domains?
- Good idea! Outside my area of expertise, though!
- The Pre-Production Test Facility - a complete emulation of more than one month's transactions, since the application environment was far too complicated to test with traditional approaches.
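The 1985 'Year2000' abend mentioned above is easy to reproduce in miniature. The following is purely an illustrative sketch in Python (Mech itself was written in S/360 AssemblyLanguage, and the names here are invented); it only shows how a two-digit year field makes a 15-year term go negative in 1985.

```python
# Purely illustrative (Python); the real system was S/360 AssemblyLanguage.
# Shows how a two-digit year field makes a 15-year term go negative in 1985.

def term_in_years(current_yy: int, maturity_yy: int) -> int:
    """Term computed naively from two-digit year fields: maturity - current."""
    term = maturity_yy - current_yy
    if term < 0:
        # The historical system abended at this point; we just signal it.
        raise ValueError(f"negative term {term}: century rollover not handled")
    return term

# A 15-year instrument keyed in the summer of 1985 matures in 2000,
# but the maturity year is stored as 00, so 00 - 85 = -85.
try:
    print(term_in_years(85, 0))
except ValueError as e:
    print("abend:", e)
```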
Possible aspects of BmoMech that may also be of particular interest here at c2:
- Transaction processing speeds - there was a limit of about 100 TPS that, surprisingly, was more related to memory use than processing power.
- BmoMech is one of the few 'invented here' systems - the other four of the BigFiveBanksOfCanada all bought IBM's COLT system and modified it. Bmo built from scratch! (The exclamation mark is warranted, as anyone who understands the difference between writing a letter and editing someone else's draft can attest.)
- BmoMech was designed for real-time posting of transactions, directly to the GL. In the early 1980s, the average Mech transaction had a path length of some 48,000+ AssemblyLanguage instructions.
- An important innovation was the idea of making logic "option-driven". RaMcDougall (usually referred to as the RAM) realized that we would eventually need many kinds of hybrid account (e.g. checkable savings, monthly statements vs. passbooks, interest-bearing checking, etc.), and that logic should therefore be driven by option switches (see the option-switch sketch after this list). In comparison, most banking systems up to that time, including COLT, had different account types: checking, savings, loans, etc., and still have trouble accommodating all the different kinds of offerings available today. I find it interesting that many discussions of OO still treat different account types as subclasses, while the RAM foresaw the problems with this 30 years ago!
- How could I forget to mention that almost all of BmoMech's batch processing was built using AMPS, the first implementation of FlowBasedProgramming (a loose flow-style sketch also appears after this list)?! Both AMPS and the application logic were all done in S/360 AssemblyLanguage, as the RAM was concerned about the performance of HighLevelLanguages and the reliability of compilers - also very prescient! Later, the Bank had a flirtation with PL/I (see PliLanguage), and built a reporting subsystem using it, but, approximately 25 years later, this subsystem has turned out to be a lot harder to maintain than the vastly more powerful AMPS portion of the system. I may add more detail as to why this should be so, in this or another page, but maybe readers of this wiki have already figured it out!
- Reading the page on TableOrientedProgramming reminded me that Mech is heavily table-driven. Typically, these tables are compiled using table-building macros, so they are fairly easy to maintain. Later in the project's evolution, under the leadership of John A., we tried to give all new tables a normalized relational form. Older tables often had more complex structures, which meant that new programmers had to learn the structure of each such table separately, and each needed custom-written search code rather than being accessible through a standard binary search facility (see the table-lookup sketch after this list).
- Paul: I know that COLT was originally just 'memo-post' as opposed to 'real time', but I thought that the other four of the Big Five had enhanced their implementations to provide real-time posting, at least of balances. Even this, of course, would still be a long way from the way BmoMech posted right down to the final GL level in the accounting system. -- hwo.
- I know we refer to these systems as 'memo-post', but I tend to think of their main architectural characteristic as being the fact that they rebuild the online database every night. So the offline system is really the 'master', while the online is the 'slave'. This has the advantage that performance does not have time to deteriorate significantly due to database adds and deletes, but it has the very real disadvantage that you have to get the online database rebuilt during an ever-diminishing "batch window", which in turn is becoming even smaller due to 7/24 pressures. While I was at CIBC, they added logic to alternate two copies of the online database, so even if one rebuild run failed, you could revert to the previous night's version. Assuming the Royal does not do this, this would partly help to explain their recent problems.
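For the 'option-driven' point above, here is a minimal sketch, assuming nothing about the actual Mech record layouts (illustrative Python; the names and fields are invented for the example): a single account record carries option switches, and the posting logic tests the switches rather than branching on an account type.

```python
# Illustrative sketch of "option-driven" account logic (not the actual Mech design).
# One account record carries option switches; posting logic tests the switches,
# instead of there being a separate program (or subclass) per account type.
from dataclasses import dataclass

@dataclass
class Account:
    balance: float = 0.0
    checkable: bool = False          # may be drawn on by check
    interest_bearing: bool = False   # accrues credit interest
    passbook: bool = False           # passbook rather than monthly statement
    overdraft_limit: float = 0.0     # 0 means no overdraft allowed

def post_check(acct: Account, amount: float) -> None:
    if not acct.checkable:
        raise ValueError("account options do not permit checks")
    if acct.balance - amount < -acct.overdraft_limit:
        raise ValueError("would exceed overdraft limit")
    acct.balance -= amount

# A 'hybrid' product is just a combination of switches - no new account class needed.
checkable_savings = Account(balance=500.0, checkable=True, interest_bearing=True)
post_check(checkable_savings, 120.0)
print(checkable_savings.balance)   # 380.0
```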
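For the AMPS point above: AMPS ran black-box processes connected by bounded buffers, which plain Python generators can only loosely imitate, but the following sketch conveys the flavour of composing a batch run from small reusable components (all names and data here are invented for illustration).

```python
# Loose, illustrative analogue of FlowBasedProgramming-style batch composition,
# with Python generators standing in for AMPS processes connected by buffers.

def read_transactions(lines):
    # 'Read' process: emits one parsed transaction per input line.
    for line in lines:
        acct, amount = line.split(",")
        yield acct.strip(), float(amount)

def select_large(stream, threshold):
    # 'Select' process: passes along only transactions at or above the threshold.
    for acct, amount in stream:
        if amount >= threshold:
            yield acct, amount

def print_report(stream):
    # 'Report' process: consumes the stream and formats output.
    for acct, amount in stream:
        print(f"{acct:10s} {amount:10.2f}")

raw = ["1001, 250.00", "1002, 15.75", "1003, 980.10"]
# Wiring the network: read -> select -> report.
print_report(select_large(read_transactions(raw), 100.0))
```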
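And for the table-driven point: the real Mech tables were built with AssemblyLanguage macros and searched in storage, so this is only an illustrative Python fragment showing why a normalized, sorted layout lets one generic binary-search routine serve every table.

```python
# Illustrative: one generic binary search over any table whose rows are
# (key, value) pairs sorted by key - the payoff of normalizing table layouts.
import bisect

def table_lookup(table, key):
    """table is a list of (key, value) tuples sorted by key."""
    keys = [k for k, _ in table]
    i = bisect.bisect_left(keys, key)
    if i < len(table) and table[i][0] == key:
        return table[i][1]
    raise KeyError(key)

# Hypothetical option table: transaction code -> routine name.
txn_table = [(10, "deposit"), (20, "withdrawal"), (35, "transfer"), (50, "inquiry")]
print(table_lookup(txn_table, 35))   # 'transfer'
```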
An important result of the design objectives, and the resulting architectural differences between Mech and COLT, was that Mech was entirely centralized while COLT used regional processors (three at RBC at the time). This created a real business opportunity that was exploited by Bmo for CashManagement. At the time, the cost of a Mech transaction was thought to be about $0.095, as opposed to RBC's $0.09 - about a 5% disadvantage, arising from the fact that local processing within a region could be a bit more cost-effective. This insight was exploited by realizing that when a transaction crossed a regional boundary, COLT's costs went to $1.12+, while Mech's stayed at $0.095. It was a genuine pleasure to sell cash management services in competition with RBC, after they stole our thunder by announcing that they too had implemented a cash management system within 24 hours of the Bmo (the FirstCanadianBank) announcement.
As I understand it, BmoMech can be used for technical detail (and maybe some neat anecdotes) on the Mechanisation system, whereas BankOfMontreal could be used for more general topics (or vice versa, Hans?).
- That's the way I see it too, which is why I posted on the Bank... page that I think we should start moving some of its contents here.
Paul: When Bmo finally converted from the 3330-class DASD disk drives, I was told that this was a most demanding task because it involved replacing a lot of logic that wrote to a physical disk location to achieve better write speeds, rather than using a higher-level mapping. Can you confirm this? -- hwo.
- The only thing I can think of is an AMPS module that I wrote to read a track at a time, which did a number of chained reads, without waiting for the index point - the module sorted out the record sequence. It was very fast! This was used for some of the DBI scans (described in one of my papers). When I went back on contract a couple of years ago, I found they had reinserted the search/TIC loop - hardly rocket science! Correction to an earlier statement: All the on-line database access stuff used BDAM, which was later converted to use EXCPs for performance reasons (after my time). I am told this (changing to the new hardware) didn't require any code changes, except for some block size changes to optimize use of the new hardware. However, I remember there was also Richmond's reorg code, which may operate closer to the hardware - Richmond likes to code on the bare metal! -- pm
- Thanks for reminding me about AMPS. I do indeed recall how impressed everyone was with its speed. I didn't fully appreciate it myself since I tended to work as a user of the technology, but I do know the transactions that were created for Direct Line benefited from AMPS.
To my personal delight, RaMcDougall not only 'found' these pages, but has also expressed his support for expanding their content. Interestingly, his comments reinforced my opinion that the MECH 'project' has a lot more significance than has been generally recognized. This leads to a number of possibilities:
- More effort should be put into explaining the importance of these developments,
- Material at this site should, however, be 'filtered' to ensure that it is 'on topic' and appropriate to this site's mission.
- A number of key players should be invited to contribute their knowledge, insights and perspectives. After all, an undertaking of BmoMech's scope is sufficiently rare to warrant documentation.
- It would be interesting to try to rank some of the other development projects, using appropriate metrics, if only to form some objective opinions about how large a project can be and still succeed in spite of concerns such as those in the MythicalManMonth.
- ...
-- HansWobbe
October 14, 2011: I'm considering putting my recollections of MECH history on-line; anyone interested in participating?
-- R. A. McDougall? (aka RAM) <RAMACD@gmail.com>
ToDo:
- Expand the explanation of how the BankOfMontreal screened its staff to identify those who could learn to program when faced with the problem of building BmoMech in the 1970s.
- Consider explaining why this could not happen today, expanding on the reasons in a way that supports some current conclusions regarding the effectiveness of "large" corporations (and especially those in Financial Services).
- Add a link to the "announcement movie" TomP gave WcH.
What about the MECH 2980 format?