To the general public the proceedings of the RHVP Judicial Inquiry can appear as exciting as watching paint dry. And that is the challenge: engaging citizens in an important issue that reveals how our democratic institutions actually function versus how a democratic society should function.

No, this is not a story about the latest pop music celebrity. Nor is it a story about your greatest sports team or hero. In fact it is about the boring issue of how we govern ourselves. How politicians act. How public administrators act. And how the boring reams of red tape and legislation perform. This is why the official news media have, in essence, ignored the proceedings of the Red Hill Valley Parkway Judicial Inquiry. Yet what appears to be very boring is in fact an opportunity to peel back the layers of secrecy that cover almost all interactions between municipally elected politicians and their publicly paid staff, and to see what lies beneath.

The issue is that the existence of a scientific report (the Tradewind Report) became exposed when a reporter from the Hamilton Spectator newspaper dug into the City of Hamilton’s involvement with its Red Hill Valley Parkway (RHVP). The Tradewind report contained data from testing of the RHVP surface, and its conclusion was that the surface friction was below an acceptable standard. This raised the possibility that the incidence of collisions on the RHVP was influenced by this deficiency. City politicians claimed that they did not know about the conclusions of the Tradewind report. This led some to focus on the actions of one administrator, Gary Moore, to whom the Tradewind report was delivered. Did Mr. Moore properly deal with the report’s findings and file it? Did he lose track of the report, or did he purposely hide it?

Meanwhile, plaintiff lawyers were successful in obtaining permission to launch a class-action lawsuit against the City of Hamilton, claiming in the neighbourhood of $250 million for the collisions that occurred on the RHVP. Presumably this lawsuit would focus on the City of Hamilton’s inaction in improving the RHVP surface conditions. Knowing these huge implications, the City opted to create a judicial inquiry that would demonstrate it was being proactive in finding out how the Tradewind report became “missing”. On April 24, 2019 the City passed a resolution requesting the formation of the Judicial Inquiry, and so the inquiry came into being.

Almost three and a half years later the RHVP Inquiry has now completed its phase of obtaining testimony from a variety of witnesses and experts. In total, over 16,000 pages of testimony have been uploaded to the RHVP Inquiry website. While this can be viewed as a demonstration of transparency, it can also be viewed as a method of drowning any interested person in detail. It would be difficult for anyone who has another life to examine all this testimony and retrieve the few nuggets that could be used to determine what happened with the Tradewind report.

Another criticism of the Inquiry lies with its narrow list of participants. The Province of Ontario, the City of Hamilton, Dufferin Construction and Golder Associates were the only entities allowed to review the various source materials available to the Inquiry. These entities hired their own lawyers, and it was only this select few who determined what questions would be asked and in what directions the inquiries would go. At no point was there an opportunity for the general public to be represented by an independent entity with no special interest in the Inquiry’s outcome. As a result, questions essential to the public interest were likely never asked of the witnesses and experts.

Many issues have arisen from the testimony, and not all of them can be mentioned here. But here is one example.

Mr. Dewan Karim was one of the expert witnesses; he gave his testimony on February 23, 2023, the last day of the testimony phase of the Inquiry. Mr. Karim described his traffic and collision analysis covering the five years 2014 through 2018. From this he reported that the collision experience along the RHVP was no different from that of other expressways in Ontario. Through further questioning it was revealed that his analysis excluded “self-reported” collisions that would have originated from sources such as collision reporting centres. While his analysis included the 504 “reportable” collisions that came from police investigations, a nearly equal number of 499 self-reported collisions was excluded from the analysis. The following exchange is taken directly from the transcript of Mr. Karim’s testimony at the Inquiry:

EXAMINATION BY MS. HENDRIE:

Q. there’s 499 collisions that were excluded from your calculation of the total Red Hill collisions between 2014 and 2018?

A. If I understand correctly, you’re referring the self-reported data is excluded?

 Q. Yes.

A. That’s correct, yes.

 Q. But you’ll agree there’s 499 collisions that don’t make it into your total?

A. 499 is used for the collision rate analysis. The self-reported is not included in the collision analysis.

Q. Okay. Maybe we can go piece by piece. The number that the spreadsheet returns when you exclude self-reported collisions — or when you include self-reported collisions is 1,003?

A. Yes, I think we agreed with that point.

Q. Yes, okay. But the number that you include in your total, which I know excludes self-reported collisions, that’s 504?

A. Yes.

Q. So the difference between that is 499?

A. Yeah, if you compare with the non-reportable and reportable or self-reportable, that would be the difference.

Q. 499 is about half of 1,003?

A. Roughly, yes.

 Q. You’ll agree with me that excluding the non-reportable collisions, you’ve excluded roughly 50 percent of the collisions that occurred on the Red Hill mainline in that five-year period from 2014 to 2018?

A. There is reason the non-reported is not included. I think I explained in the morning. If you want me to repeat, I can repeat that.

 Q. No, I’ve got your reason. I just want to talk about the numbers. So I don’t think you actually answered my question that you’ll agree with me that by excluding the self-reported collisions, there’s approximately 50 percent of the collisions on the Red Hill mainline that were excluded from the total?

A. Yeah, it was excluded because of the unreliability of the locations that is in the self-reported.

Q. But it was excluded?

 A. It is excluded for unreliable information.

Q. And you’ll agree with me that directionally, by excluding that 50 percent of collisions, that would also have the effect of reducing the Red Hill collision rate that you calculated by approximately 50 percent? If there’s half the collisions, half the rate?

A. That might be the way you’re looking at. I’m looking at the reportable collision perspective, which has far more detailed information which will be farther accurate compared to the non-reportable data which is, for example, Greenhill has a lot of non-reportable. If I include that, it will show the Greenhill section is much higher collision rate, which in reality that may be just an error of coding or whoever information is provided. So in general, professional practice, whether it’s Ministry of Transportation,  City of Toronto, where I work all my professional life that I worked on all types of collision, we make a decision based on the reportable collision data, not always –

Even persons with the least experience in drawing conclusions from numerical analysis should have difficulty with this exchange. How can you reach a reliable conclusion from your data when 50 per cent of it contains “unknowns” or statistical error? The easy way is to pretend that those data points containing “unknowns” simply do not exist. And Mr. Karim informed the Inquiry that, as a result of this supposedly acceptable methodology, the collision rate was 0.69 collisions per million vehicle-kilometres for the northbound direction and 0.43 for the southbound direction.
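For readers unfamiliar with how such a figure is produced, the sketch below shows the standard arithmetic: collisions divided by travel exposure, where exposure is the vehicle-kilometres accumulated over the study period. The traffic volume and segment length used here are hypothetical placeholders, not values from the Inquiry record; the collision counts are the 504 and 1,003 figures from the exchange above. The point is simply that, with exposure held constant, excluding roughly half the collisions cuts the computed rate roughly in half.

```python
# Sketch of a collision-rate calculation per million vehicle-kilometres.
# The traffic volume and segment length below are hypothetical placeholders,
# not values from the Inquiry record.

def collision_rate_per_mvkm(collisions, aadt, segment_km, years):
    """Collisions per million vehicle-kilometres of travel."""
    vehicle_km = aadt * 365 * years * segment_km   # total travel exposure
    return collisions / (vehicle_km / 1_000_000)

AADT = 40_000        # hypothetical average annual daily traffic
LENGTH_KM = 7.0      # hypothetical mainline segment length
YEARS = 5            # 2014-2018 study period

reported_only = 504      # police-reported collisions (from the transcript)
all_collisions = 1_003   # reported plus self-reported (from the transcript)

rate_reported = collision_rate_per_mvkm(reported_only, AADT, LENGTH_KM, YEARS)
rate_all = collision_rate_per_mvkm(all_collisions, AADT, LENGTH_KM, YEARS)

print(f"Rate using reported collisions only: {rate_reported:.2f} per MVKm")
print(f"Rate using all collisions:           {rate_all:.2f} per MVKm")
# Because the travel exposure is unchanged, excluding roughly half of the
# collisions cuts the computed rate roughly in half.
```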

Mr. Karim defended his methodology by referring to the Ontario Ministry of Transportation (MTO) which, he claimed, does not use self-reported collision data in its analyses:

Usually in my 26 years of data dealing with MTO, I have never seen the Ministry of Transportation disclose any self-reported data for professional use, so it’s extremely rare, and I have never used or received the self-reported data.

A tabulation of collisions along the Lincoln Alexander and Red Hill Valley Parkways was developed by the Hamilton Spectator newspaper from data supplied by the City of Hamilton and was displayed in an article published by the Spectator on July 17, 2017. Did that data include the numerous self-reported collisions? How would anyone know? And why should anyone ask when such basic information was not explained by the City of Hamilton?

And this raises an additional concern. Collision reporting centres were created in Ontario in 1994. Since that time a percentage of collisions occurring in Ontario has been reported by those involved in them, not by police, who are presumed to be independent parties. Furthermore, many of the facts recorded in police reports are not captured in self-reported collisions. So since 1994 the reported characteristics of a large number of collisions have become unreliable. Yet every year, for decades, the Ontario Ministry of Transportation has published the Ontario Road Safety Annual Report (ORSAR), which is described as an accurate reporting of the status of road safety in Ontario. But how accurate is the ORSAR? Are the vast number of self-reported collisions included in the ORSAR analysis, or are they simply ignored as in Mr. Karim’s analysis? If they are included, how is the likely inaccurate content of these reports dealt with?

In 2015 the threshold for collision reporting changed: where that threshold had been $1,000 in damage, it was increased to $2,000. This change should reasonably have produced a reduction in the number of reported property-damage collisions between the years 2015 and 2016 in the ORSAR data, since Property Damage Only (PDO) collisions represent the vast majority of all collisions. But that did not happen. The numbers below show the rates of collision occurrence (per 100 million kilometres travelled) reported in the ORSAR between the years 2004 and 2019 (with a few years missing):

2004 = 189.67
2005 = 184.06
2006 = 165.84
2008 = 183.84
2010 = 166.26
2012 = 135.85
2013 = 142.51
2014 = 161.22
2015 = 159.32
2016 = 147.75
2017 = 145.22
2018 = 145.64
2019 = 152.96

Note that there is no inflection point between the years 2015 and 2016, where the number of reported collisions should have dropped sharply due to the threshold change. So what happened? Did the authors of the ORSAR fudge the data? Or do they explain this through some methodology of “industry-accepted massaging of the data”?
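For readers who want to check this claim themselves, the short sketch below uses only the rates listed above to compute the year-over-year change. The 2015-to-2016 step is a decline of roughly 7 per cent, no larger than several earlier swings in the same series, which is hardly the sharp drop a doubling of the reporting threshold should have produced.

```python
# Year-over-year change in the ORSAR collision rates listed above
# (collisions per 100 million kilometres travelled).
orsar_rates = {
    2004: 189.67, 2005: 184.06, 2006: 165.84, 2008: 183.84,
    2010: 166.26, 2012: 135.85, 2013: 142.51, 2014: 161.22,
    2015: 159.32, 2016: 147.75, 2017: 145.22, 2018: 145.64,
    2019: 152.96,
}

years = sorted(orsar_rates)
for prev, curr in zip(years, years[1:]):
    change = (orsar_rates[curr] - orsar_rates[prev]) / orsar_rates[prev] * 100
    gap = "" if curr - prev == 1 else f" (spans {curr - prev} years)"
    print(f"{prev} -> {curr}: {change:+.1f}%{gap}")

# The 2015 -> 2016 step is about -7.3%, no larger than several earlier
# year-to-year swings in the same series.
```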

But there is more. Whether collisions are officially reported or self-reported, they still represent a small number of incidents compared to those that never get reported at all. One can look at collision severities as a pyramid whose very small top is made up of fatal collisions. Below that sits the larger number of injury-producing collisions, and below that the still larger number of property-damage collisions. But at the base of this pyramid is an even larger layer of unreported collisions, and this layer is never revealed. Experts will claim that the numbers of such minor collisions do not need to be known because they are of minimal consequence. But is that truly the case? That additional layer may provide additional clues as to what factors are causing collisions.

How many minor collisions or incidents occur that never become reported? Research on this issue has been conducted by Gorski Consulting since 2009 at a site in London, Ontario, where physical evidence of minor incidents has been documented on a frequent basis. These data have been compared to the official collision data collected by the London City Police. In a Gorski Consulting website article posted in April of 2018 (“Historical Patterns in Loss-of-Control Events At Specific Road Locations”) it was shown that “more than 80% of loss-of-control collisions and incidents did not exist in police records of reported collisions”. If similar numbers of unreported incidents exist in the RHVP data, what does that say about the official data on which experts rely to support their conclusions about collision causation?
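To make the implication concrete, the sketch below applies an assumed under-reporting fraction to an official count. The 80 per cent figure comes from the London, Ontario site study quoted above; whether anything like it applies to the RHVP is unknown, so the calculation is purely illustrative.

```python
# Illustrative only: how an under-reporting fraction scales an official count.
# The 80% figure comes from the Gorski Consulting site study in London, Ontario;
# whether anything like it applies to the RHVP is an open question.

def estimated_total(police_count, unreported_fraction):
    """Estimate total incidents if a known fraction never reaches police records."""
    return police_count / (1.0 - unreported_fraction)

police_recorded = 504   # RHVP mainline reportable collisions, 2014-2018
for fraction in (0.5, 0.8):
    total = estimated_total(police_recorded, fraction)
    print(f"unreported fraction {fraction:.0%}: estimated total ~ {total:.0f}")

# At 80% unreported the true total would be roughly five times the
# police-recorded figure, and any rate built only on the official count
# would be understated by the same factor.
```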

The reality is that when you do not collect reliable data on a very large percentage of collisions, the accuracy of any conclusions drawn from that data becomes suspect at best. So when experts claim to have a good appreciation of the collision rate on the RHVP, those claims must also be suspect. And this is not helpful in coming to an understanding of how the surface of the RHVP may have affected the incidence of collisions.

Many issues have cropped up in the testimony of the various witnesses at the RHVP Judicial Inquiry. Only one has been reported here, and even that has led to a long article that threatens to tax the interest of the reading public. Unfortunately details are necessary to prove a point, yet details can sometimes be boring. For this reason no further discussion will be attempted at this time.