A Forty-Year View

I recall that it was on my late father’s birthday, December 6, 2014, that I showed up for the first time at a weekly gathering of free/open source geeks at a pub in Ottawa. Like the others around the table, I was on a mission.

For over four decades my fintech company, which I started when I was in my 30s, had been doing very well supplying a fast, high-volume transaction processing solution to several of the world’s largest banks, credit card companies, insurance firms, equities traders, and other financial services firms. Our mainframe system could check millions of transactions per second against an in-memory data store with 4,500 tables of digital rules. Globally, somewhere around 6 billion transactions a day were being validated against these electronic rule sets. We were going strong; I had retired as CEO and stayed on as CTO of the firm.

But I had just turned 70! And it was on my mind that the problem that had originally motivated me to design an extremely fast, high-volume transaction processing solution was still not resolved!

By the late 1970s it was well understood that we had a global environmental problem. Rachel Carson’s “Silent Spring” was published in 1962, Barry Commoner’s “Science and Survival” appeared in 1967, followed by “The Closing Circle” in 1971. James Lovelock and Lynn Margulis published the Gaia hypothesis in 1974.

I concluded, shortly after attending the First Global Conference on the Future held in Toronto on July 20-24, 1980, that a time would come that would require the systematic moderating of unbounded ecological stresses and resource exploitation. It is worth taking a look at a short outline of the conference.

Find a conference retrospective in Looking Back on the Future.

My mission back then was to address the problems described by the “Tragedy of the Commons” in its many manifestations: pollution, loss of habitat and biodiversity, human disruption of atmospheric conditions.

As far as I knew, no system other than jurisdictional taxation existed that could operate at this level of complexity, and taxation was much too crude in its unautomated implementations. To solve this multi-economy problem, I felt that the world needed a type of fiscal system capable of applying up-to-date environmental constraints (rules) to every transaction, in every market. It would be prudent to be ready.

As luck would have it...

I reckoned that I had a bit of background on how to design that.

The IBM System/360 mainframe era was in full swing when I graduated from the University of Toronto in 1969, having been taught FORTRAN by Dr. Ralph Stanton and Dr. Ken Fryer.

In my postgraduate studies, Professor Sheldon J. Glasser taught me “Data Processing Concepts and Techniques” for solving the IT processing problems of the day. From him I learned the framework for representing logic through tables.

I’m going to repeat that, because it’s important: Shelly taught me how to operationalize logic through tables.
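To make that idea concrete for readers who have never seen it, here is a minimal sketch of logic operationalized through tables: the rules live in a data table rather than in branching code, so they can be changed without touching the evaluation logic. This is only an illustration of the general technique, not tableBASE itself; all field names, operators, and thresholds below are invented for the example.

```python
# Rules-as-data: each row of this decision table is data, not code.
# Changing policy means editing rows, not rewriting the program.
# (Illustrative only -- names and values are hypothetical.)

RULES = [
    {"field": "amount",  "op": "gt", "value": 10_000, "action": "flag_review"},
    {"field": "country", "op": "eq", "value": "XX",   "action": "reject"},
]

# A small, fixed vocabulary of comparison operators.
OPS = {
    "gt": lambda a, b: a is not None and a > b,
    "eq": lambda a, b: a == b,
}

def evaluate(txn: dict) -> list:
    """Return the actions triggered by a transaction, in table order.

    The evaluator is generic: it never mentions any specific rule,
    so the same few lines of code serve any number of rule rows.
    """
    return [
        rule["action"]
        for rule in RULES
        if OPS[rule["op"]](txn.get(rule["field"]), rule["value"])
    ]

print(evaluate({"amount": 25_000, "country": "CA"}))  # -> ['flag_review']
```

The design choice is the point: the evaluation engine stays small and stable while the tables carry all the volatile business logic, which is why table-driven systems tend toward minimal maintenance.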

We became colleagues and friends, and together with Wayne Cunneyworth and a growing team over the following years we collaborated on the development of tableBASE™, a robust commercial in-memory database written in assembler for IBM mainframes.

Wayne did an excellent job explaining our underlying tabular processing method in a 90-page technical paper that we published in 1994: “Table Driven Design: A development strategy for minimal maintenance information systems”. Shelly’s untimely passing on March 24, 1999 left a huge gap, just as some of our largest global clients began implementing tableBASE for their mission-critical data processing.

Having worked on several complex solution designs for the Canadian and some provincial governments and for the world's largest banks, as I moved into my 40s I also tried to apply this efficient method for data structuring and processing to provide feedback to the population about the state of our world, and to create a dynamic fiscal framework to counteract problems through system-level automation. I was invited to speak on this topic to the newly formed European Union, and to US and Canadian taxation authorities, in early 1990. Full of confidence, in 1999 I submitted a proposal to the OECD Advisory Commission on Electronic Commerce, entitled “Adaptive Technology: A Foundation for Automating the Taxation of E-commerce.”

Alas, the political will was just not there. And since it is governments that control taxes, there did not seem to be much more that I could do from the private sector.

Fifteen years rolled by rather quickly. By the time I turned 70, it was clear that the free/libre/open-source way of getting creative things done was cutting through inertia in many domains. So I started asking around about who might be able to take my idea and run with it, using that licensing and development model.

When I met Joseph Potvin at the pub in 2014, I was very excited to finally meet an economist who understood my mission and, moreover, my rules-as-data solution to the informatics problem. We had each pursued fiscal system redesign two decades earlier. My functional system design was called a “One-Pass Tax”, and his incentive system design was called a “Resource and Ecosystem Degradation (RED) Tax”, part of his broader Earth Reserve Assurance framework.

Without delay we decided to combine our efforts. Between us we reckoned that we could pull together the know-how and people to advance a viable solution using the free/libre/open source model. We incorporated the not-for-profit Xalgorithms Foundation in January 2016, described the essential features of a general-purpose Internet of Rules, and began to assemble a community of talented and dedicated contributors. The new design would benefit enormously from concurrent advances in decentralized, distributed computing methods. Our operational prototypes had to be done and re-done and re-done as new methods became available. I’d like to thank Don Kelly, Calvin Hutcheon and Ryan Fleck for many hours of creative effort, sticking with us through multiple reformulations!

Along the way Joseph synthesized the core conceptual foundations into a doctoral thesis at University of Quebec, entitled “Oughtomation: Practical Normative Data”. Professor Stéphane Gagnon has provided consistent, capable research guidance.

At present, various community members are creating three distinct reference implementations of the oughtomation method. This gives me confidence that we are assembling something with immense promise. Craig Atkinson deserves special mention for his insights into how the emergent general-purpose Internet of Rules capability that we’re enabling can be harnessed to bring efficiencies to cross-border trade. His proactive global outreach has brought into the process people from a wide range of business, law, government and civil society organizations. In closing, what I find particularly gratifying is to be working with bright, motivated minds, people from a plurality of perspectives, who are committed to jointly changing, through mutual interest, “What Is” in this world to what “Ought To Be”.

You too are welcome to bring your perspective and passion to this ongoing effort!

William Olders

Drawing upon 30 years as the founding President and CTO of a firm that specializes in high-volume rule-based transaction processing for several of the world's largest banks, credit card companies and insurance firms, Bill serves as co-founding Chair of the Xalgorithms Foundation, providing technical guidance on tabular declarative programming methods.