Articles by: Simon Vodrey
Simon Vodrey is an Ottawa native and a PhD candidate at Carleton University’s School of Journalism and Communication. He also holds a Master’s degree from that same institution. Simon’s journalistic and research interests include politics, journalism and communications history, automotive design, and crisis communication.

When Labels Aren’t Just for Soup Cans & Rolodexes Just Aren’t Enough

March 15, 2016 12:00 pm

South Carolina is an American state whose reputation for political trench warfare precedes it. Inside a crowded North Charleston auditorium, stumping for his brother Jeb, former president George W. Bush reminded his audience: “There seems to be a lot of name calling going on, but I want to remind you what our good dad told me one time: labels are for soup cans.” However, when the polls closed, the results indicated that, contrary to the former president’s words, labels weren’t just for soup cans.

As the 2016 Republican South Carolina primary ended, Jeb Bush had a less-than-stellar finish: he came in fourth place with approximately 7.8% of the vote. Donald Trump, the larger-than-life businessman and media personality, won the South Carolina primary with a robust 32.5% of the vote. The two junior U.S. senators in the Republican race, Marco Rubio of Florida and Ted Cruz of Texas, trailed Trump by roughly 10 percentage points but fought a photo finish between themselves for second and third place, with Rubio ultimately winning 22.5% of the vote and Cruz 22.3%. In sum, even the former president’s eleventh-hour return from the political abyss to campaign for his brother proved to be no more than an exercise in futility.

Republican presidential frontrunner Donald Trump.

By the end of that fateful evening, Jeb Bush threw in the proverbial presidential election campaign towel: he suspended his campaign. And thus the possibility of a third Bush moving into the White House in 2016 was no more. The ultimate establishment Republican presidential candidate lost to an opponent who, because of his inflammatory comments, boisterous behaviour and sparse policy details, is shaping up to be the ultimate Republican anti-establishment candidate.

Indeed, Trump wasted no time proving that labels weren’t just for soup cans in South Carolina. His long-standing labeling of Jeb as “low energy” proved to be a surprisingly effective tactical trope on the road to the primary. It was effective indeed, especially since, as many cynical media critics aptly noted, Trump had made his own entrance with a triumphant escalator ride from the lobby of Trump Tower into the building’s atrium and food court, the site of his presidential announcement, more than half a year before South Carolina’s primary.

By the time the Bush campaign realized that the Trump campaign wasn’t merely a political sideshow that could be dismissed as the flavor of the week, it was too late. The Bush campaign had gamed out numerous electoral scenarios, but none predicted the visceral and systemic anti-establishment momentum that Trump’s campaign reflects. Even worse, Jeb Bush and his campaign failed to alter their strategies once it became clear that Donald Trump was morphing into a conservative populist juggernaut, albeit one whose rhetoric has largely been interpreted as igniting a civil war within the Republican Party, stoking fears that his divisive campaigning could torpedo the GOP’s prospects of winning the White House not just this November but for the foreseeable future.

Mary Matalin, a well-known senior Republican political insider, once said that “the maximum goal in professional politics is ‘No surprises.’” As was implied earlier, that mantra cannot be applied to the 2016 Republican primary race — both up to and then beyond South Carolina. But you would be hard-pressed to find tangible evidence that the Bush campaign playbook took note. In fact, the Bush campaign playbook appeared to be set in stone and, accordingly, was anything but agile. Therefore, when surprises on the campaign trail occurred, the Bush campaign was unable to counter its opponents’ messaging and lacked an action plan to get its own candidate’s message out to voters. Consequently, the Bush campaign broke what Matalin called “Cardinal rule 101 of politics: Never let the other side define you.”

People line up to speak with Ohio Governor and Republican presidential candidate John Kasich at a restaurant in Myrtle Beach, South Carolina, on February 11, 2016, following his second-place showing in the New Hampshire primary.

Photo by Spencer Platt/Getty Images.

To make matters worse for Jeb Bush, money didn’t talk. The Bush campaign ran the epitome of what many call an “insider” campaign fundraising strategy. Simply put, an insider strategy is a form of Rolodex fundraising: it relies on connections and contacts, and the donations it brings in arrive in large increments. Front and center here are the so-called Super PACs which, in a nutshell, are recently created fundraising committees legally entitled to raise and then spend unlimited sums of money collected from individuals, corporations, associations and unions to publicly promote (or oppose) political office seekers and incumbent officeholders. The catch is that these Super PACs can neither coordinate with the candidates they are advocating for or against, nor donate their funds to said candidates.

But for Jeb Bush’s campaign and the Jeb Bush-friendly Super PACs, there was another catch: the approximately $150 million they raised was largely spent in vain. Much of that money went to crafting and then disseminating political advertisements that simply fizzled. The “low energy” label, coupled with the campaign’s failure to adjust its tactics to the insurgent Trump campaign and the fundamental anti-establishment sentiment of this presidential election cycle, meant that the writing was on the wall well before the South Carolina Republican primary sounded the death knell of the Bush campaign. In essence, the lesson of the failed 2016 Jeb Bush presidential campaign is this: when running for president, labels aren’t just for soup cans and Rolodexes just aren’t enough.

The Seventy-Fifth Anniversary of Facing the “Flying Peril”

November 1, 2015 12:09 pm

In 1934, two decades after the outbreak of the First World War and five years before the onset of the Second, a prescient British politician and former soldier named Winston Churchill spoke about the threat posed to England by air warfare. Churchill remarked that, “The flying peril is not a peril from which one can fly. It is necessary to face it where we stand. We cannot possibly retreat. We cannot move London.” A mere six years later, in the summer and early autumn of 1940, both Churchill and England herself would face that “flying peril” in what would become the Battle of Britain. This year marks the seventy-fifth anniversary of that David and Goliath struggle.

When Germany invaded Poland in early September 1939, signaling the start of the Second World War, it became abundantly clear that British Prime Minister Neville Chamberlain’s appeasement-based foreign policy had been built upon a foundation of sand. Chamberlain’s earlier, and ultimately infamous, declaration of “peace for our time” with Hitler’s Nazi Germany was not to be. As the invasion of Poland commenced, Churchill’s self-imposed exile in the political abyss ended. He was quickly appointed to the familiar position of First Lord of the Admiralty, the same title he had held in the early years of the First World War. By the time Hitler’s Luftwaffe (the German air force) was preparing its colossal aerial assault on England the following summer, Chamberlain had resigned and Churchill had been appointed as his successor.


London during the Battle of Britain in September 1940.

When Germany launched the Battle of Britain in July of 1940, its goal was to overwhelm the Royal Air Force (RAF), and in particular the RAF’s Fighter Command, thereby softening up Britain’s defenses and breaking the spirit of the British people prior to the staging of Hitler’s large-scale, long-planned amphibious invasion and occupation of Britain, codenamed “Operation Sea Lion.” When the first squadrons of Luftwaffe aircraft warmed up and taxied for takeoff just across the English Channel in recently conquered, Nazi-occupied France, the numbers seemed to be in Germany’s favor: the Luftwaffe held a nearly 2,000-aircraft advantage over what (on paper at least) was the RAF Fighter Command’s paltry inventory of some 600 combat-worthy aircraft. But a combination of factors ensured that the numbers did not remain in Hitler’s favor for long. The innovative English “Chain Home” and “Chain Home Low” radar systems detected air traffic approaching English airspace, while the vigilant members of the Observer Corps tracked and relayed to the RAF the coordinates of inbound Luftwaffe aircraft once they had crossed the coast; the Observer Corps was indispensable because the Chain Home and Chain Home Low radar towers all pointed out to sea and could not pivot to monitor English airspace itself. Also important was the “survival instinct-induced” industrial expansion in the production of military aircraft and equipment. And we cannot overlook the relentless efforts of the skilled pilots of the RAF, not to mention the approximately 100 Canadian pilots who flew with them.

But you wouldn’t have known this during the early stages of the battle. The Channel skirmishes of July and August allowed the Luftwaffe to wreak havoc on the supply convoys keeping the island nation’s economy afloat and gave German pilots the opportunity to gain a proverbial bird’s-eye view of the English coastal defenses before striking their first aerial blow inside official British airspace. These Channel skirmishes also further depleted the RAF of both men and combat-worthy aircraft. Britain was in peril.

By mid-August, Hermann Göring, Hitler’s close confidant and designated commander of the Luftwaffe, decided that the time had come to carry out the true aerial assault on Britain. Knocking out the aforementioned “Chain Home” and “Chain Home Low” radar systems was to be the first step in the attack. But this proved more difficult than anticipated. The radar towers and their infrastructure were surprisingly robust despite repeated dive-bombing and strategic bombing attacks. In light of this setback, Göring ordered the Luftwaffe to focus its efforts on the RAF’s airfields rather than on the radar systems, which, much to his frustration but to Britain’s benefit, remained close to fully operational, as did Britain’s telecommunications network. England was neither deaf nor blind in the face of the daily aerial onslaught of Luftwaffe fighters and bombers. This proved to be a fatal error for both Göring and Hitler in the air war.

In addition to attempting to neutralize the RAF’s airfields, the Luftwaffe, under Göring’s command, focused its efforts on destroying Britain’s aviation factories as well as those providing ancillary services and products to the aviation industry. It was also during August that Göring prioritized the bomber protection (or escort) role of the Luftwaffe’s fighter squadrons. Rather than hunt the less advanced and outnumbered RAF fighters, many of the Luftwaffe’s quick and maneuverable fighters were relegated to bomber escort duties, a far less efficient way of destroying the RAF’s aircraft and the skilled pilots who flew them.

As the “scrambling” of the RAF’s venerable Hawker Hurricane fighters and the newer, faster Supermarine Spitfires became a more regular occurrence throughout August and into September, the dogfights between the RAF and the Luftwaffe grew in both intensity and frequency. But the earlier manpower and equipment advantage enjoyed by the Luftwaffe began to evaporate. British radar systems, working hand in hand with the telephonic relaying of Luftwaffe aircraft movements by the Observer Corps, increased the efficacy of the RAF’s combat response to “the flying peril” described by Churchill so many years earlier.

Britain’s robust radar networks and the diligence of the Observer Corps were key in eliminating the element of surprise and preserving the speed of counterattack, two essential components of successful aerial dogfighting. Furthermore, Britain’s explosive production of replacement fighter aircraft during the summer of 1940 helped erode the earlier German aerial advantage. Despite relentless aerial bombardments, the monthly production of RAF fighter aircraft increased by a staggering 60%, providing the RAF with over 1,700 new combat-worthy aircraft. By the fall of 1940, the RAF was able to replace its aircraft and pilots faster than the Luftwaffe could. The Luftwaffe’s earlier advantage in the proverbial numbers game was coming to an end.


Contrail webs from a Dogfight during the Battle of Britain.

In light of this fact, in September 1940, Göring changed the Luftwaffe’s tactics yet again. Rather than continue to prioritize the RAF’s airfields, the Luftwaffe’s formation bombing would now focus on England’s cities and towns, most notably London. Beginning in early September, during the period that would come to be called the “Blitz,” London and many of England’s other cities faced nightly bombardment, since it had become clear to Germany that daylight bombing runs were proving too costly. The Luftwaffe’s cumbersome, outgunned and under-armored bombers that pelted English cities with bombs and incendiaries were generally forced to carry out their attacks without fighter escort, thereby increasing their vulnerability to the RAF’s fighters and to the English anti-aircraft batteries. The Luftwaffe’s deadly workhorse of a fighter, the fearsome but thirsty Messerschmitt Bf 109, simply could not carry enough high-octane fuel to fly from occupied France across the English Channel, as far into England as London, and back again. Fuel economy was becoming one of the biggest foes of Hitler and Göring’s Luftwaffe.

As October approached, although the air raids reached deeper into the heart of England, Luftwaffe bombers were being shot out of the sky with greater ease than before, given their lack of fighter escort. The numbers were now turning in Churchill’s and Britain’s favor. By the second half of September, with the Luftwaffe facing repeated setbacks in its hubristic attempt to assert air superiority over Britain, Hitler postponed Operation Sea Lion, the proposed amphibious invasion of England, until at least the spring of 1941. As the bombing raids over London and many other English cities grew more sporadic and less deadly throughout October, the Battle of Britain wound down; Operation Sea Lion would never be launched.

By Halloween 1940, the Battle of Britain had officially come to a close. Britain’s RAF had held off the aerial onslaught of the German Luftwaffe. Germany had been unable to gain air superiority over Britain, making an amphibious invasion a likely exercise in futility. The grave reality of the “flying peril” that Winston Churchill had warned of had been confronted and ultimately subdued by Churchill himself as Prime Minister, by his government, by the English people and by the men and women of the RAF.

An Unwelcome Guest Returns to Formula One

July 27, 2015 2:01 pm

Jules Bianchi in his Marussia Formula One car.

This month, after a 21-year hiatus, an unwelcome guest returned to Formula One racing.


Jules Bianchi’s crash at the 2014 Japanese Grand Prix.

At the rain-soaked 2014 Japanese Grand Prix on Sunday, October 5, Marussia F1 driver Jules Bianchi, a 25-year-old native of Nice, France, lost control of his car and slammed into the back of an industrial mobile crane that was removing the crashed Sauber car that German driver Adrian Sutil had walked away from on the previous lap. Bianchi’s Marussia rounded the Dunlop curve and hit the back of the crane at close to 100 miles per hour, with enough force to momentarily lift the crane into the air and off the edge of the soggy track. Bianchi suffered a severe traumatic brain injury known as a diffuse axonal injury, which left him in a coma from the moment of the crash. On July 17 of this year, some nine months later, Bianchi succumbed to his injuries.

Bianchi’s death marks the first in Formula One since the infamous 1994 season, a season made particularly dangerous by mandated rule changes that eliminated drivers’ aids such as ABS (anti-lock brakes), adaptive suspensions and traction control, to name but a few, in an effort to increase competition amongst drivers. The intent was to emphasize driver skill and eliminate what was widely perceived as an unhealthy reliance on technical aids that were supposedly making the cars easier to drive and therefore limiting competition among the drivers themselves. When it came to technology, the less-is-more approach of the 1994 season would prove fatal, claiming the lives of two drivers: Austrian rookie Roland Ratzenberger and three-time Formula One world champion Ayrton Senna, both of whom were killed in high-speed crashes only one day apart at the 1994 San Marino Grand Prix in Imola, Italy. Ratzenberger’s crash took place on qualifying day, Saturday, April 30, whereas Senna’s occurred the following day during the Grand Prix itself.


Roland Ratzenberger’s 1994 fatal crash.

Unlike Bianchi’s collision, both of the fatal crashes in 1994 occurred in dry conditions. During qualifying for what would have been only his second Grand Prix, the damaged front wing on Ratzenberger’s Simtek car broke off, robbing the car of downforce, leaving it uncontrollable and sending it careening into a wall at a blistering 196 miles per hour. The very next day, Senna climbed into his Williams for the last time. Tucked into the cockpit was an Austrian flag that he planned to wave during his victory lap as a sportsmanly way of honoring the recently departed Ratzenberger. That opportunity never came.

On the seventh lap of the San Marino Grand Prix, Senna slammed into a wall at approximately 135 miles per hour and fell victim to a fatal combination of circumstances. The sheer force of the impact thrust one of the Williams’ front wheels into the cockpit, violently forcing Senna’s head back into the headrest and fracturing his skull, while the front suspension broke apart in the crash, sending shrapnel through the visor of his helmet. By the time he was extracted from the mangled car, Senna had lost all brain activity. By that evening he was dead.

In the immediate aftermath of the deadly 1994 San Marino Grand Prix, rigorous new safety requirements were introduced. The back-to-basics approach of the 1994 season (concerning technology and, more specifically, drivers’ aids) was quickly jettisoned. Subsequent seasons would once again embrace technology in the name of both safety and speed. Consequently, in the two decades since 1994, Formula One racing has become a substantially safer and faster sport. But, as Jules Bianchi’s accident in Japan reminds us, no matter how technologically advanced the cars may now be and no matter how skilled the athletes who drive them, risk cannot be eliminated completely. With this in mind, death, the unwelcome guest, returns to Formula One racing. Perhaps the iconic lost-generation writer F. Scott Fitzgerald was right when he ended The Great Gatsby with the words: “So we beat on, boats against the current, borne back ceaselessly into the past.”

Oil: A High-Stakes Game of Chance

January 15, 2015 11:30 am

Popular wisdom has it that the only certainties in life are death and taxes. Until a few short months ago, most economists, politicians, political pundits and journalists would have added a third item to that short list: a perpetual increase in the price of oil. But, in today’s interconnected world, much can happen to a global commodity like oil within the space of a few months.

A generally lethargic global economy, a hedged bet by OPEC (the intergovernmental Organization of the Petroleum Exporting Countries) as well as an upswing in production of unconventional oil in America’s shale oil fields have all contributed to a significant oversupply of crude in the global market, resulting in a substantial decrease in the price of the world economy’s hydrocarbon lifeblood. The consequence: global oil prices have fallen by upwards of 50 per cent.

Rapidly falling oil prices have undoubtedly been a boon for motorists and should also serve as a form of stimulus for a global economy still recovering from the financial downturn that began in 2008. At the most basic level of pocketbook analysis, lower crude oil prices encourage greater consumption and therefore help to stimulate the global economy. Lower prices put extra income in consumers’ pockets, income that can be spent on purchases and services which might not otherwise be consumed. The current slide in oil prices is also a boon for oil-dependent industries. For instance, lower prices increase the sales of automobiles although, to the dismay of many environmentalists, they typically spur the sale of less fuel-frugal vehicles. Lower oil prices also reduce the operating costs of airlines and other transportation-centric industries, which, in a perfect world, should translate into lower consumer prices for the products being transported. However, as we all know, this does not always happen.

Other beneficiaries of today’s lower oil prices are the countries that must import oil to subsist. Unfortunately, countries, like the consumers who inhabit them, are more likely to spend than to save the extra money that, in the past, would have been earmarked to cover transportation costs. To claim otherwise would be mere wishful thinking.

But if the current slackening of global oil prices is a boon for individual consumers and oil-importing countries, it is a bane for oil-producing and exporting countries. Since Canada is a net energy exporter, the slump in oil prices will directly affect the Canadian economy. While Canada’s oilsands will undoubtedly be hurt by the sudden sharp decrease in the global price of oil, the jury is still out on how severe that impact will be. A continuing slide in prices could render numerous projected oilsands extraction projects economically untenable, thereby reducing capital investment in Canada’s energy sector. Furthermore, on a broader level, the decline in oil prices could increase Canada’s trade deficit and substantially reduce budget surpluses that had been predicted on the basis of a higher international price of oil. And here is where both OPEC and the American shale oil industry come back into the picture.

Typically, when global oil prices slide, OPEC makes the conscious decision to bolster the price by reducing the amount of crude that its members pump into the world’s oil market. Because of OPEC’s dominant position in oil production, this has the effect of controlling fluctuations and ultimately stabilizing the global market price of oil. However, in an unforeseen turn of events, OPEC has not indicated that it will pursue its usual price-bolstering strategy. Instead, its member states will keep pumping rather than curtail production to prop up the sliding price of oil. It appears OPEC will not put the brakes on the global slide in the price of crude.

Many economists speculate that this uncharacteristic action is aimed at undercutting the United States’ burgeoning shale oil industry. In other words, the hand that OPEC appears to be playing is to let the global price slip below the point at which most of America’s lucrative (and rapidly expanding) shale oil projects can cover the costs of exploration, extraction, transportation and refining. If the global price of oil sits below the American shale oil industry’s break-even price, investment in American shale oil will dry up, and existing firms will have to take on more debt to cover their costs while the commodity itself is worth less with each passing day.

The trump card OPEC seems to be counting on is this: the lower global oil price could limit future investment in America’s shale oil deposits, thereby making that industry uneconomic. Prodigal investors and consumers would then return to OPEC’s own traditional light sweet crude. But why take the risk?

Thanks to recent advances in extraction engineering, America’s oil production has risen to approximately 9 million barrels per day. That tally is uncomfortably close to the daily output of Saudi Arabia, OPEC’s most lucrative and productive member. In fact, only about a million barrels per day now separate American output from that of Saudi Arabia. The American oil industry is nipping at the heels of the longstanding reigning champion of oil production, and that champion is taking evasive action to curtail the growth of its new and vigorous competitor. Before OPEC and the American oil industry finally turn over their cards (and the price of oil eventually stabilizes), Canada’s oil industry could become collateral damage in what increasingly appears to be a high-stakes game of chance. But whatever the outcome, until then, consumers will smile all the way to the pump.

Nissan Skyline GT-R: Automotive Weapon & Forbidden Fruit

November 20, 2014 1:00 pm

Nineteen eighty-nine was by all accounts a banner year, one of fundamental change.

In February of 1989, convoys of Soviet tanks, artillery and armored personnel carriers lumbered across the Afghan-Uzbek Bridge and back into the Uzbek Soviet Socialist Republic, thereby signaling the Soviets’ abandonment of their nearly decade-long failed military engagement with the Mujahideen in Afghanistan. In June, tanks and armored personnel carriers belonging to the People’s Liberation Army rolled out of Tiananmen Square and into the streets of Beijing as the communist Chinese government forcefully dispersed swarms of student protestors while implementing a state of martial law. Then, in November, the Berlin Wall fell. The symbol of the Cold War that served as the boundary separating communist East Berlin from capitalist West Berlin had lost its mandate. And in August 1989, fundamental change of another order came to the automotive industry as well. While “the weaponry” originated in the East, it most certainly was not spawned behind the Soviet “Iron Curtain.” Instead, it emerged in Japan.

It was during the dog days of summer that Nissan introduced its Skyline GT-R BNR32. Nissan designed the Skyline GT-R BNR32 to function as a sophisticated weapon on wheels. It was to be a weapon capable of doing battle with Group A racing’s iconic BMW M3 and the venerable Ford Sierra RS500 Cosworth without so much as breaking a sweat. On this account, the Skyline GT-R succeeded. Furthermore, it dominated Group A racing between 1990 and 1994, going on to win numerous European endurance races during that same time span.

Nissan R32 Skyline GT-R.

While the R32 GT-R was designed to compete on the world’s most challenging race tracks, regulations at the time required that the car be homologated first. In other words, in order for Nissan to race the R32 GT-R, it needed to produce and sell the car to the general public beforehand. And if the R32 GT-R was no stranger to the winner’s circle on countless professional circuits, it was on the streets that the car would truly earn its unofficial nickname of “Godzilla.” Overnight, it became the most sought-after car among tuners and import enthusiasts around the globe. The technology hidden under its metallic skin prompted much of this praise.

It was powered by a twin-turbocharged 2.6-litre dual overhead camshaft (DOHC) inline six-cylinder engine code-named RB26DETT. The engine produced a severely underrated 276 horsepower at 6,800 rpm and 260 foot-pounds of torque at 4,400 rpm. While those numbers are anything but tantalizing today, they were impressive for 1989. That year, a typical Chevrolet Corvette had a massive V8 engine yet developed only 250 horsepower. In fact, Ferrari’s brand-new V8-powered sports car for 1989, the Ferrari 348, produced only 300 horsepower, a mere twenty-four more than the less prestigious, but just as quick, Nissan.

Nissan’s RB26DETT engine was officially capped at 276 horsepower because of a gentleman’s agreement of the era whereby Japanese manufacturers were to refrain from producing cars for public consumption that developed more than 280 horsepower. In reality, the engine was built so robustly that tuners soon realized that, after some tinkering, the straight-six in the R32 could reliably produce more than double the factory-quoted horsepower without frying its internal components. When it came to horsepower, the sky was the limit for the Skyline GT-R. Consequently, after some tweaking in the hands of tuners, R32 GT-Rs on the street were capable of meeting, and often exceeding, the roughly 550 horsepower that the typical Group A race-prepped GT-Rs produced.

But horsepower is only one element of a tremendous sports car. Copious horsepower serves no purpose if it cannot be translated into traction. More often than not, high horsepower results in a loss of traction. In other words, it can cause an automobile to handle imprecisely because it overpowers its tires, suspension and transmission, resulting in slow cornering in the best case and wheel spin, fishtailing or loss of control in the worst.

Nissan RB26DETT engine.

With that in mind, in 1989, before the advent of the now ubiquitous computer-controlled traction control systems that turn even the most rambunctious tire-shredding sports cars into controllable automobiles, Nissan’s engineers concluded that producing potent but usable horsepower in a rear-wheel-drive sports car was not a realistic option. Enter a sophisticated all-wheel-drive system called ATTESA E-TS, which stands for Advanced Total Traction Engineering System for All-Terrain with Electronic Torque Split. The last two words are what gave the GT-R the upper hand when it came to translating power into traction that could be harnessed to out-handle its rear-wheel-drive competitors. In fact, many would even say that those words gave the R32 GT-R an advantage over the all-wheel-drive cars that preceded it.

Unlike most all-wheel-drive cars, which split torque evenly between the front and rear wheels, the GT-R sends all of its power to the two rear wheels under normal driving conditions, allowing it to behave like a rear-wheel-drive car. The moment its sensors detect slippage, however, power is instantly transferred to the front wheels as well, giving the GT-R the surefooted traction and stability that are the byproduct of all-wheel drive. This balancing act allowed the GT-R to out-handle most conventional all-wheel-drive automobiles. It also made the GT-R much more lively on twisty roads: the back end could slide out like a rear-wheel-drive car’s, but the computers would catch it and send the requisite power to the front wheels, preventing a loss of control and ensuring the quickest exit from a corner or slide.
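
The control strategy described above boils down to a simple rule of thumb: stay rear-driven until the rear wheels start to spin faster than the fronts, then feed torque forward. The sketch below is only a toy illustration of that idea in Python; the slip threshold, the 50/50 cap and the function name are invented for the example and are not Nissan’s actual ATTESA E-TS calibration.

```python
def torque_split(front_wheel_speed: float, rear_wheel_speed: float):
    """Toy model of the behaviour described above.

    Returns (front_share, rear_share). Fully rear-driven until the rear
    wheels spin noticeably faster than the fronts; then torque is shifted
    forward in proportion to the slip, capped at an even 50/50 split.
    All numbers are illustrative, not Nissan's real calibration.
    """
    slip = rear_wheel_speed - front_wheel_speed   # wheel speeds in km/h
    if slip <= 1.0:                 # good grip: behave like a rear-wheel-drive car
        return (0.0, 1.0)
    front_share = min(0.5, slip / 20.0)   # more slip, more torque sent forward
    return (front_share, 1.0 - front_share)

print(torque_split(60.0, 60.5))   # grip      -> (0.0, 1.0)
print(torque_split(60.0, 70.0))   # heavy slip -> (0.5, 0.5)
```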

Handling and traction were such paramount concerns for the engineers at Nissan that they even equipped the Skyline GT-R with a rear-wheel-steering system known as Super HICAS. While Super HICAS was included with the intention of further sharpening already razor-sharp cornering, the jury is still out on its effectiveness. Regardless of the merits of the Super HICAS system, in its day the Skyline GT-R was a technological tour de force.

The Skyline GT-R and its techno-weaponry were further refined in the two generations that succeeded the R32: the R33, released in 1995, and the R34, released in 1999. Sadly, the sun set on Nissan’s road warrior at the end of 2002, when production was halted, and it would be half a decade before the Skyline GT-R’s successor, an automobile known simply as the Nissan GT-R, would truly bring Nissan’s performance portfolio into the twenty-first century.

Nissan R32 Skyline GT-R cockpit.

While the R32, R33 and R34 Skyline GT-Rs are less powerful and slower than the more technologically advanced Nissan GT-R, the older cars have an extremely strong appeal for automotive aficionados. This is likely because the older Nissans struck a balance between man and machine which is arguably lost in the current Nissan GT-R. For instance, all of the Skyline GT-Rs came equipped with tight-shifting manual transmissions requiring a heightened sense of driver agency and skill. The current Nissan GT-R is available only with a semi-automatic transmission that requires less driver agency and skill to use. In other words, too much liberty has been granted to the machine in the current car.

The older GT-Rs also appeal to contemporary auto enthusiasts because of their price and street presence. Unlike the Nissan GT-R, whose base price is $108,000, the older Skyline GT-Rs can be found for as little as a tenth of that sum. But there’s a catch. Since the R32, R33 and R34 Skyline GT-Rs were never sold domestically in North America, they are classified as grey-market vehicles and must be fifteen years old before they can be legally imported and registered for use on Canada’s roads. Consequently, for North American auto enthusiasts, the Nissan Skyline GT-R was always an example of Japanese forbidden fruit. Happily, in 2014, that forbidden fruit has ripened and just may be worth a bite.

Requiem for a Clutch

July 31, 2014 5:06 pm

A sea change in automotive design is upon us. This systemic shift in engineering means that most automobiles today come equipped with just two pedals; the clutch, that all-important third pedal, is missing. And, much to the dismay of driving enthusiasts, the automatic has overtaken the manual, becoming the default transmission in the automotive world.

This usurpation did not occur overnight; it has taken decades. But it has accelerated dramatically in recent years because, in a frantic attempt to meet government-mandated fuel economy and emissions requirements, automakers have developed automatic transmissions that are increasingly more fuel-frugal than their three-pedaled counterparts. Worse yet, an unintended consequence is that fewer and fewer motorists can really appreciate the skill involved in driving.

The manual transmission allows the engine’s speed to be synchronized with the speed of the drive wheels. The clutch pedal allows the driver to manually connect and disconnect the engine and the transmission. Shifting gears requires the engine and the transmission to be momentarily disconnected while the driver slides the transmission into the appropriate gear for the engine’s speed. The lower gears let the engine turn much more quickly than the drive wheels, which delivers copious amounts of torque, essentially the twisting power required to move an automobile from either a standing or a rolling start. Higher gears are often used to improve fuel efficiency by keeping engine rpm (revolutions per minute) lower than would be possible if the automobile had fewer gears.

In the not-so-distant past, when engines typically had more cylinders and larger displacements than those found on the road today, fewer gears were needed for a vehicle to perform optimally, since greater amounts of torque were available at lower rpm. But today, when fuel economy matters considerably more than it did in the past, automotive engineers have shed as many cylinders and trimmed as much displacement as possible without unduly sacrificing performance. They have also added more gears. As a result, unlike the larger engines of the recent past, most of today’s engines produce less torque across the rev range and therefore need a greater number of gears to deliver as much torque as their predecessors could.
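
To put rough numbers on the trade-off described in the last two paragraphs, here is a small sketch in Python. The gear ratios, final-drive ratio, torque figure and wheel speed are illustrative values only, not taken from any particular car; the point is simply that a short gear multiplies torque at the wheels while a tall gear keeps engine revs, and therefore fuel consumption, down.

```python
# Illustrative figures only; not from any specific vehicle.
ENGINE_TORQUE_LB_FT = 200.0    # torque produced by the engine
FINAL_DRIVE = 3.5              # differential (final-drive) ratio
WHEEL_RPM_AT_60_MPH = 800      # roughly what a typical road tire turns at 60 mph

for gear, ratio in {"1st": 3.5, "3rd": 1.4, "5th": 0.8}.items():
    overall = ratio * FINAL_DRIVE
    wheel_torque = ENGINE_TORQUE_LB_FT * overall   # low gears multiply torque (losses ignored)
    engine_rpm = WHEEL_RPM_AT_60_MPH * overall     # tall gears keep engine speed down
    print(f"{gear}: about {wheel_torque:.0f} lb-ft at the wheels, "
          f"{engine_rpm:.0f} engine rpm at 60 mph")
```

Run as written, the sketch shows first gear trading engine speed for torque, which is why it is used only to get the car moving, while fifth gear does the opposite, which is precisely the fuel-economy role described above.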

Driving can be a relaxing and satisfying endeavor but driving a car with a manual gearbox requires both patience and practice. Intuition and immediate gratification do not apply when driving stick. And the old axiom that practice makes perfect has never been more true. With time the novice driver can transition from a series of jerks and stalls to smoothly rowing through the car’s gears, but balance is still required when behind the wheel of any car using a manual transmission.

Unlike driving an automatic, driving a manual requires you to move your left foot and your right foot with precision and synchronicity while also using your right hand to shift gears. Hand-eye coordination, not to mention foot-eye coordination, is requisite. And that is what driving is really about. Needless to say, that balance can only be achieved through a level of focus well beyond what an automatic demands. Driving a manual means constantly listening to the engine’s pitch and tone to decide when to downshift or upshift so that the engine performs at its best. With an automatic, all that is involved is steering: you just point the car in the direction you want to go and nothing more.

On a subjective level, driving a manual is quite simply more stimulating and much more rewarding than driving an automatic. There is no substitute for the driver satisfaction generated when the balance and timing discussed earlier come together in a seamless shift. Most automotive enthusiasts would, I am sure, agree.

It should be noted that it is not just the run-of-the-mill grocery-getters that are abandoning the manual transmission. The automotive performance brands are doing the same. Porsche no longer offers its iconic 911 Turbo with a manual transmission; Ferrari and Lamborghini have also shuttered production of automobiles with three pedals. And while Mercedes-Benz continues to build some of the most powerful engines ever bolted into street-legal automobiles, none comes with a manual transmission. Audi, BMW, and the astronomically expensive British boutique brand Aston Martin are among the few luxury sports car brands that still offer manual gearboxes, and therefore still require a driver who really knows how to drive.

On this side of the Atlantic, many of the reincarnated rear-wheel-drive muscle cars sold by the brands formerly known as the “Big Three” (General Motors, Ford and Chrysler) still offer manual gearboxes, as do some of their European-inspired “hot hatches,” which are essentially front-wheel-drive hatchbacks with high-horsepower four-cylinder engines. But Japan’s small-engined yet powerful sports cars, extremely popular with young driving enthusiasts and tuners for the past twenty years (like the Mitsubishi Lancer Evolution and the fabled Nissan Skyline GT-R), have also been jettisoning the clutch pedal. Sadly, the few automobiles that continue to use a manual transmission are merely the exception that proves the rule.

Today, in place of the traditional manual gearbox, sports car manufacturers have increasingly been producing sequential manual gearboxes which allow a driver to switch gears with Formula One-style paddles located behind the car’s steering wheel. Flicking the paddle on the driver’s right shifts the transmission into a higher gear, while flicking the paddle on the left shifts it into a lower gear. But these sequential gearboxes have no clutch pedal, and so they never demand the balance and connection with the automobile described earlier: the synchronicity required to drive a car equipped with a conventional manual gearbox, or stick shift.

Furthermore, sequential manual gearboxes (paddle shifters or semi-automatic transmissions) can often be ignored completely. The driver can usually opt to leave a paddle-shifter-equipped automobile in fully automatic mode, eliminating the already marginal effort required to flick up or down through the gears. Sports car manufacturers claim that these paddle shifters can make faster gear changes than even the most skilled driver in a stick-shift car, and that is what is making the traditional manual-transmission sports car obsolete. While the claim may be true in terms of track times and performance figures, the balance, focus and skill required to drive a stick shift are being made obsolete along with it.

Yet momentum continues to build relentlessly for the automatic transmission. Much to the dismay of automobile aficionados, there may come a time when knowing how to drive a stick shift will be considered about as useful as knowing how to read and write Latin. Hopefully that day will not come soon.

Want the Best Pizza in Ottawa? Head West to Stittsville & Kanata

June 12, 2014 2:20 pm

Horace Greeley, the nineteenth-century politician, author and proprietor of the then extremely influential New York Tribune newspaper, famously offered this sage advice: “Go west, young man.” One could argue that Greeley’s reasoning also applies to the quest for gourmet pizza. Why, you may ask? Summer is upon us and with it comes patio and deck season. When your appetite sharpens and barbeque just will not do, there is really only one place to turn for truly delicious pizza: Jo-Jo’s Pizzeria in the fast-growing town of Stittsville, a suburban enclave on the fringe of Kanata.

Jo-Jo’s Pizzeria has served Stittsville and its surrounding area for over thirty years, producing nutritious, delicious, freshly prepared food daily. Throughout much of that time, it has been owned and operated by the Kassis family. This remains the case today, as the business is now owned and operated by twenty-nine-year-old Zeyad Kassis and his three sisters: twenty-eight-year-old Fatina, twenty-five-year-old Natasha and twenty-three-year-old Rebecka. Together they manage the day-to-day operation of this popular gourmet pizzeria.

Three years ago, in June 2011, the company transitioned to its current, more youthful, ownership and managerial structure. It was then that Zeyad, Fatina, Natasha, and Rebecka assumed their father’s ownership stake in the company and laid the foundation for expanding the menu and the physical presence of Jo-Jo’s Pizzeria well beyond Stittsville. A professionally trained chartered accountant, Zeyad feels strongly that, “We’re a family-operated business, so we serve [our customers] as the extension of our family.” This approach results in both delicious pizza and bountiful charitable work in Ottawa’s west end.

The old saying that “the whole is only as good as the sum of its parts” can be applied to pizza. With this reasoning in mind, it may come as no surprise that good pizza cannot be crafted from poor-quality ingredients. This fact is clearly recognized at Jo-Jo’s Pizzeria, where premium food products are prepared using nothing but first-quality ingredients. Zeyad is quick to point out that, “we use the best ingredients that are available to us in all our products.” More specifically, he explains that, “our ingredients are always fresh ingredients. Our vegetables are always cut every day. Our traditional-style pizza dough is made fresh every day.” While the standard operating procedure for many pizzerias is a reliance on factory-prepared and processed ingredients, Jo-Jo’s bucks this trend. It even uses real home-cooked sliced bacon in its Caesar salads as well as in its pizzas and specialty sandwiches.

Christmas Day is the one day a year that the pizzeria is closed. With that one exception, the chefs at Jo-Jo’s prepare their own homemade tabouli daily. And every other day, those same chefs make their homemade hummus as well as the garlic dressing that adds a unique flavor to their beef donairs and chicken shawarmas. But in the competitive world of pizza, the foundation for excellence begins with the dough and the sauce. Jo-Jo’s traditional pizza dough is made fresh every single day. As for the pizza sauce, quality remains the primary concern. Unlike the many pizzerias whose sauces are full of corn starch and preservatives, Jo-Jo’s Pizzeria uses only its signature preservative-free sauce, made from vine-ripened tomatoes fresh from California. But it isn’t just the daily replenishment of fresh ingredients that results in scrumptious pizza. A large and evolving menu also helps capture a foodie’s interest.

Over the years, Jo-Jo’s has added many new pizza toppings to its menu, but the ever-popular Jo-Jo’s Special, a mixture of pepperoni, mushrooms, green peppers, olives, onions, tomatoes, and bacon, remains a classic. And that is not to mention Jo-Jo’s extensive array of creative submarine sandwiches, finger foods and quality Lebanese dishes. One popular new item is Jo-Jo’s gluten-free crust. As Zeyad explains, when a customer orders “our gluten-free crust, what they’re getting is a new, fresh, gluten-free crust that’s made for us by a local mom and pop shop.” He goes on to say that this is what “differentiates us from our competition, which uses a vacuum-sealed product that was made who knows when.” Again, quality is key.

But service counts too. “Our competitive advantage would definitely be in our product and our service,” claims Zeyad, explaining that, “We try to offer exceptional products as well as exceptional service, from the person who answers the phone, to the individual who delivers the pizza to your door.” The longevity enjoyed by this family business requires no further explanation.

Yet there is more here than meets the eye. Jo-Jo’s is an important part of the fabric of society in Stittsville, providing the pizza that students enjoy on pizza days. In fact, as Zeyad notes, “We’re privileged to serve schools in Stittsville, not to mention Kanata and Almonte.”

Beyond this, Jo-Jo’s is actively involved in charitable work in Ottawa’s west end by way of sponsorship and fundraising events. Last year alone, Jo-Jo’s sponsored sixty-one local youth hockey and ringette teams and was the principal sponsor of two hockey tournaments in Ottawa’s Goulbourn region. And this year the numbers are growing.

The pizzeria is also a substantial contributor to the community in other ways. In addition to its sponsorships, Jo-Jo’s underpins many charity fundraising events in Stittsville by providing the food that fundraisers rely upon when holding their raffles, book sales and picnics. Zeyad sums up the situation by stating that, “Within the community, especially in Stittsville, you would be hard-pressed to find some sort of fundraising activity that we’re not part of.”

Clearly, Jo-Jo’s is not merely another takeout pizzeria. It offers clean and comfortable modern dining facilities at both the original location on Main Street Stittsville and the new Kanata location which opened last August. Both locations can serve large parties as well as cater for businesses and private events. Expansion in the west end of Ottawa is the name of the game for Jo-Jo’s Pizzeria. “We would like to expand the brand recognition that we earned in Stittsville to a broader portion of the City of Ottawa’s west end,” says Zeyad Kassis. As for the future, he notes that, “we’re going to expand into Kanata and then work our way through Ottawa, and then see where that takes us.”


Tragedy Touches Lac-Mégantic

July 16, 2013 10:20 am

Just over a week ago, the picturesque Quebec town of Lac-Mégantic entered the public lexicon for all the wrong reasons. On Saturday, July 6, at 1:14 a.m., an unmanned runaway freight train belonging to the Montreal, Maine & Atlantic (MMA) Railway Ltd. derailed in the heart of the town. But this was no ordinary derailment. The train was 1.5 kilometers long and consisted of 73 cars, all but one of which were carrying crude oil from North Dakota’s Bakken region bound for an oil refinery in New Brunswick. The runaway train entered the town without warning at 101 km/h, more than ten times the maximum safe speed of 10 km/h, and upon reaching the curve in the tracks behind the main business district of Lac-Mégantic, many of its tanker cars and their highly flammable cargo derailed. They piled up like a gigantic iron accordion, exploding into flames on impact and quickly turning the heart of Lac-Mégantic into an unimaginable inferno.

Almost 40 buildings were instantly destroyed and dozens of residents were killed, many of whom were likely vaporized by the extreme heat from the explosions and subsequent fires. To date, 33 bodies have been recovered from the destruction and at least 17 more individuals remain missing and are presumed dead.

The scene was set for tragedy when, around 11 p.m. on Friday, July 5, more than two hours before the accident, the ill-fated train’s engineer parked the train on the mainline for an overnight stopover in Nantes, some 10 kilometers up track and also uphill from Lac-Mégantic. At that time, the engineer’s shift for the night was over. He took a cab to a hotel in Lac-Mégantic to sleep as the train sat on the tracks awaiting his replacement who would guide it further along its way to the New Brunswick refinery. As is the standard (and required) operating procedure, one of the train’s locomotive engines sat idling overnight. However, shortly after the engineer’s departure, smoke from what turned out to be a minor fire caused by a ruptured fuel or oil line in the sole running locomotive caught the attention of nearby residents of Nantes. That fire was promptly extinguished by local firemen. In putting out the fire, the firemen may have turned off the power to the locomotive. This may have been the true starting point for the chain of events that would ultimately result in catastrophe because, in order to keep the entire train’s airbrake system functioning, at least one locomotive must be running at all times. Without that one engine running, there would be no power to provide the pressurized air that maintains the air braking system itself.

That said, it is important to note that the airbrake system is not the sole means of preventing a parked train from moving. All trains still use manually set handbrakes as well: each car has one, and when a train is left unattended, whether overnight or just momentarily, the train’s crew must apply numerous handbrakes. Herein likely lies the answer to the mystery of how and why the train began its deadly roll downhill towards its unscheduled final destination in Lac-Mégantic.

The key question in discovering why the accident occurred will likely be this: did the engineer apply a sufficient number of handbrakes to keep the train stationary on the track’s downhill grade of 1.2%? Before that question can be answered, it must be noted that the phrase “sufficient number of handbrakes” is best described as subjective, because there is no cast-in-stone rule determining how many handbrakes on any given train should be considered sufficient. Many railroaders use the so-called “10% plus two rule,” which stipulates that a train can be held in place by applying the handbrakes on 10% of the cars making up the train plus an additional two cars, but there are no hard-and-fast rules on this subject. In other words, what would be deemed a sufficient number of handbrakes on any given train often differs from one railway to another and from one crew to another.
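
As a back-of-the-envelope check only, the small Python sketch below applies that rule of thumb to the 73-car train involved here. The rounding convention is an assumption on my part since, as noted above, securement practices vary from railway to railway and from crew to crew.

```python
import math

def handbrakes_ten_percent_plus_two(num_cars: int) -> int:
    """Informal '10% plus two' rule of thumb: set handbrakes on 10% of the
    cars (rounded up here; the rounding practice is an assumption) plus two."""
    return math.ceil(num_cars * 0.10) + 2

# Applied to the 73-car Lac-Mégantic train, the rule suggests about ten
# handbrakes, in the same range as the 11 brakes MMA later said were needed.
print(handbrakes_ten_percent_plus_two(73))  # -> 10
```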


MMA Chairman Edward Burkhardt addresses the media in Lac-Mégantic (Photo: John Kennedy, The Montreal Gazette)

Shortly after the accident, MMA Chairman Edward Burkhardt stressed the fact that, as far as he knew, the engineer had indeed adequately secured the train for its overnight stop, despite the track’s downhill grade of 1.2%. However, Burkhardt has since claimed that a total of 11 handbrakes needed to be activated to prevent the train from starting its deadly descent into Lac-Mégantic, and that the accident itself indicates that the engineer failed to do this before leaving the train unattended overnight. Determining whether or not the engineer did set the required 11 handbrakes will not be an easy task, given the scope and scale of the destruction on the ground in Lac-Mégantic. However, the locomotive’s event data recorder should be able to help clarify one aspect of the handbrake mystery.

The event data recorder will allow investigators to match the locomotive’s mechanical and instrument readings against the timing of events between the securing of the train for the night and the outbreak of the minor fuel-line fire and the engine’s possible powering down by Nantes’ local firemen. That information would allow investigators to judge whether the engineer took any action indicative of testing the train’s handbrakes before turning in for the night. After setting what is deemed to be the appropriate number of handbrakes, engineers are required to test their effectiveness by trying to move the train under power, either forwards or backwards. Should the train move while the throttle is applied, an insufficient number of handbrakes has been set; the engineer must apply more handbrakes and then repeat the test. Only when the train cannot move under power has the test been passed and the train been truly secured. The event data recorder should indicate whether the handbrakes were tested in this manner before the train’s throttle was placed in its idle position and the engineer called it a night.

Quite aside from the root cause of the runaway train, the accident itself has drawn public attention to the resurgence of shipping oil by rail. Because the volume of oil waiting to be brought to market from Canada’s oilsands as well as from the vast shale oil fields in North Dakota and Montana far exceeds existing pipeline capacity, and because the costs (both financial and regulatory) associated with expanding that capacity have increased substantially (as has the virulence of organized environmental opposition to such projects), the volume of oil and petroleum products shipped by rail has increased markedly over the past few years. For instance, according to Statistics Canada, the number of Canadian railcars delivering crude oil or other petroleum products doubled between April 2012 and April 2013: 7,194 railcars delivered Canadian petroleum products in April 2012, compared with 14,217 in April 2013.

But it is when we look south of the border that we see more clearly the increasing importance of shipping oil by rail. In the United States, 355,933 carloads of oil were shipped by rail during the first half of 2013 alone, compared to just 5,358 carloads during the first half of 2009. Even so, with a combined Canadian and American oil production of some 11.26 million barrels per day, the data reveal that less than 8% of North American oil was brought to market by rail during the first half of 2013.

Sadly, the tragic accident in Lac-Mégantic is now being harnessed by those on both the left and right of the political spectrum in an attempt to further their own political agendas. On the left, environmentalists are urging Canadians to re-examine and ultimately — but unrealistically — abandon fossil fuels not only because of their environmental impact but also because of their supposedly inherent risks to mankind, as witnessed by the trail of destruction left behind in Lac-Mégantic. Also from the left of the aisle, Thomas Mulcair, the Leader of the New Democratic Party (NDP) and the Leader of Her Majesty’s Loyal Opposition, has spuriously claimed that the accident could have been related to “public safety-related” spending cuts made by Stephen Harper’s Conservative government. On the right, advocates of the seemingly politically-toxic topic of pipeline expansion are claiming that the accident in Lac-Mégantic highlights the pressing need to greenlight present and future pipeline projects, given that shipping oil by pipeline generally results in fewer accidents than shipping oil by rail. However, data has shown that, while oil delivered by pipeline may indeed experience fewer accidents (and therefore spills) than oil delivered by rail, pipeline spills tend to release a greater volume of oil than rail-related spills.

Rhetoric aside, neither the tragic accident in Lac-Mégantic nor its probable regulatory fallout will fully satisfy the vested interests on the left or right of the political spectrum. While the derailment may marginally slow the expansion of shipping oil by rail and usher in the replacement of the widely used, but prone-to-rupture-upon-impact, Class 111A tank cars with safer and stronger tank cars constructed of thicker tank shells, the need for oil to produce everything from gasoline to plastic patio furniture will ensure that the oil trains keep rolling above ground and the pipelines keep flowing below ground.


Top Photo: Bell Media

Why You Should Never Turn Your Back on the Sea

July 11, 2013 5:33 pm
Why You Should Never Turn Your Back on the Sea

In our continuing series on the Canadian Coast Guard we look at some rescues and learn some valuable lessons about the power of the sea.

“Never turn your back on the ocean” is an old Hawaiian saying. It is sage advice for any mariner. Most sailors are aware that the ocean is a dangerous and unforgiving place. Yet, as Thomas E. Appleton writes in Usque Ad Mare: A History of the Canadian Coast Guard and Marine Services: “Seamen (of the past) were cynical about safety procedures which are nowadays considered appropriate. In fact, very few of them could swim.” It seems unthinkable today but that was the case.

There are countless stories of survival and battles with that mysterious and unpredictable force known as the sea. There is no other Canadian organization that knows this more than the Canadian Coast Guard (CCG).

On the morning of February 22, 2009, some 400 kilometres due east of St. John’s, the Spanish fishing trawler Monte Galineiro caught fire and began listing violently to her port side. It soon became clear that the trawler could no longer stay afloat, leading the crew to send out a distress message and abandon ship. Most crew members left without the protection of an immersion suit, a lifejacket or, in one case, anything other than underwear — proof positive that the sea can catch you off guard.

Within 20 minutes of the first sign of trouble, the Monte Galineiro was swallowed by the North Atlantic. Yet as luck would have it, the Canadian Coast Guard’s CCGS Leonard J. Cowley was only about two miles away. Its captain, Derek LeRiche, had been observing the Monte Galineiro, intending to board and inspect the fishing trawler, when he heard its Mayday message. Ten minutes later, the Leonard J. Cowley was on the scene.

The CCG has provided search-and-rescue services since its inception in January 1962. Sometimes, luck combines with skill for optimal performance — as was the case with the Monte Galineiro — but at other times the sea has different plans.

Early on January 30, 1993, the Cape Aspy, a scallop trawler, sailed out of Lunenburg on a calm sea headed for the fertile fishing grounds of Georges Bank. The Cape Aspy would never reach its destination. According to the Transportation Safety Board of Canada report, the Cape Aspy sailed “in a southwesterly direction at a speed of 10 knots, a course which gradually diverged from the protection of the coast, giving an increase in the roughness of the seas.” To allow easy movement by her crew, the Cape Aspy sailed into these rougher seas with “many weathertight/watertight openings to the hull in the open or in the closed but unsecured position.” As the trawler sailed on, its increased speed, coupled with stronger wind and spray and temperatures of minus 20º, combined to create a build-up of ice on its superstructure. The rough sea was positioning the increasingly top-heavy Cape Aspy for trouble.

Around 11 p.m., the Cape Aspy began to list to starboard as ferocious waves washed over its deck. Fifteen minutes later, “the vessel rolled heavily to starboard, partially returned to the upright and then rolled further to starboard; the list suddenly increased to about 45 degrees and the vessel appeared to be ‘settled by the head.’” Cape Aspy was dying.

As she filled with frigid water, all but two of her 16 crew members managed to climb into thermal immersion suits. When the order to abandon the listing ship was given, these crew members, minus the ship’s master, inched their way toward an inflatable life raft on the port side of Cape Aspy’s stern by slipping and sliding along the ice-covered wall of the wheelhouse.

As the life raft was inflating — a process taking only a few seconds — a powerful wave swept it and 15 crew members into the sea. “Two of the crew members managed to enter the life raft and they assisted others to board from the sea. The mate and three crew members could not reach the life raft, and the strong winds caused it to drift away. Reportedly, the master did not manage to leave the wheelhouse,” the inquiry found.

Although the Cape Aspy sent out distress messages, “they were not received by any shipboard or shore-based radio station, probably due to ice accretion on the Cape Aspy antenna.” In fact, as the inquiry report put it, “the only indication of distress was provided by the vessel’s Class I Emergency Position Indicating Radio Beacon (EPIRB) which floated free and activated automatically after the vessel sank.” Once the EPIRB was activated, CCG and commercial vessels together with search-and-rescue aircraft began the search. At 1:45 a.m., an aircraft homed in on the EPIRB, sighting a life raft in the rough water 40 minutes later. At 3:42 a.m., surface vessels directed to the scene by aircraft plucked 10 of the Cape Aspy’s crew members out of the life raft. An hour or so later, another ship nearby scooped up an immersion-suit-clad survivor floating in the water. The bodies of three crew members were later recovered, but two men were never found.

Tragedy struck again five years later when, on January 16, 1998, a bulk carrier named the Flare encountered a violent winter storm 45 miles to the southwest of Saint Pierre and Miquelon. Facing gale-force winds, the Flare was being pounded by waves in excess of 16 metres. Carrying less than the required minimum ballast for an Atlantic crossing, the Flare was riding dangerously high on the water. Had the Flare been more heavily ballasted, she would have ridden lower and better absorbed the sea’s pounding.

Around midnight, the ocean began to get the better of the Flare. A deafening bang drowned out the howling wind and roaring waves as the hull began to flex more markedly. Yet, after investigation, the Flare did not appear to be in immediate danger. That appraisal of the situation changed four hours later.

At 4:30 a.m., a second bang echoed through the ship, followed by violent hull vibrations. This time, the Flare had broken in two. The entire crew was standing on the stern section as it suddenly listed to starboard. The men watched helplessly as the bow section floated away in the rough sea. They slid along the snow-and-ice-covered decks, making their way in total darkness to the Flare’s lifeboats. Unfortunately, the ship’s increasing starboard list prevented the starboard lifeboat from being launched. Repeated attempts to launch the port lifeboat were also fruitless. The crew’s last resort was to launch a life raft over the ship’s stern, but before the crew could climb in, the life raft was washed away.

Within 30 minutes of the Flare breaking in two, the stern section had disappeared beneath the surface along with all but six members of the crew. These six men clung to the capsized port lifeboat, which had broken free of its moorings as the Flare sank. Two of the initial survivors succumbed to exhaustion and the sea. When a search-and-rescue helicopter located the remaining four survivors six hours after they had plunged into the frigid water, the magnitude of the destruction was fully visible. Additional helicopters, fixed-wing aircraft and CCG surface vessels surveyed the Flare’s grave, a scene that included a lingering oil slick extending some 10 miles to the west and about three miles in a north-south direction, an empty lifeboat, two vacant life rafts and 15 oil-soaked bodies, as well as the bow section of the Flare which remained afloat for four days after its sundering.

Before the crew abandoned ship, a “hurried, indistinct and incomplete Mayday” was sent out on a very high frequency (VHF) radiotelephone. Yet, that distress message appeared to be from an unidentified vessel. It would take some time before the vessel in trouble would be identified as the Flare and the incomplete information could be analyzed.

CCG’s director-general of operations, Wade Spurrell, says that when it comes to search and rescue, many variables are involved, especially when ships don’t follow regulations or respect the force and power of the sea. Normally, when a ship issues a distress call by radio, “that signal would be picked up by the CCG radio traffic stations and the information would be relayed to one of our rescue coordination centres. Once the call goes in and the information is interpreted, the coordinator determines what assets are available and then tasks a primary resource, usually a Coast Guard surface vessel or a search-and-rescue aircraft functioning in coordination with the Department of National Defence.

“The search-and-rescue system is built on partnerships,” Spurrell noted. “In an operational context, DND is the lead department. The Royal Canadian Air Force has the knowledge and experience with their air assets, while the Canadian Coast Guard brings our knowledge of the marine environment and expertise. The typical response time for a primary asset is no more than thirty minutes. If it’s more, we would have to investigate why it took longer than thirty minutes. Where it gets more complicated is when we have only a snippet of information [as was the situation with the Flare]. In those cases, the controller might decide to task aircraft and ships to determine the nature of the incident. Most of the advancements in reducing the gap between the time when a distress signal comes in and the time when search-and-rescue technicians arrive on the scene of a vessel in distress are technological.”

Some of these time-reducing, life-saving technological advancements include satellite positioning and more accurate satellite-based emergency-locating beacons; improvements in radar; infrared cameras which are used to search for heat signals to help locate missing sailors; night vision gear; advancements in computer-based programs to run search modeling; sophisticated drift buoys that can be dropped from a ship or aircraft to simulate the drifting pattern of a lifeboat or a raft; and great advances in rescue boats, personal flotation devices and immersion (survival) suits. Nevertheless, as the Hawaiians know, you should never underestimate the power of the ocean.
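As a rough illustration of what “search modeling” involves, the toy calculation below estimates how far a life raft might drift given a surface current and a small “leeway” fraction of the wind. Every number in it is an assumed value for illustration; it is a sketch of the general idea, not the Coast Guard’s actual software or parameters.

```python
def drift_displacement(hours: float,
                       current_kmh: tuple = (1.5, 0.0),
                       wind_kmh: tuple = (30.0, -10.0),
                       leeway_fraction: float = 0.03) -> tuple:
    """Return an (east_km, north_km) displacement for a drifting raft,
    assuming it moves with the surface current plus a small fraction of
    the wind. All default values are illustrative assumptions."""
    vx = current_kmh[0] + leeway_fraction * wind_kmh[0]
    vy = current_kmh[1] + leeway_fraction * wind_kmh[1]
    return (vx * hours, vy * hours)

# After six hours under these assumed conditions, the raft has drifted roughly
# 14 km east and 2 km south of its last known position, the kind of estimate
# used to centre an initial search area.
print(drift_displacement(6.0))
```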




The Most Valuable Commodity in Health Care

3:24 pm
The Most Valuable Commodity in Health Care

In Oliver Stone’s 1987 film Wall Street, Gordon Gekko, the ruthless corporate raider played by Michael Douglas, explains that when picking stocks, “the most valuable commodity I know of is information.” This same mantra should apply when it comes to the serious business of making health care decisions — but too often that advice goes unrecognized by Canadians.

What then is the origin of the information used by Canadians in shaping their opinions — and making their choices — about health care? Generally speaking, there is no single, one-stop source that Canadians uniformly consult. Wendy Nicklin is the President and CEO of Accreditation Canada, an organization that functions at arm’s length from government and that develops standards and assesses health care providers to help them improve their performance. She puts it this way:

“Canadians get their health care information from many different sources. Some of these include health care professionals, such as family physicians and nurses, as well as family members, friends, the media and the Internet.”


Jeffrey Simpson, The Globe & Mail‘s national affairs columnist & health care analyst

Jeffrey Simpson, The Globe and Mail’s national affairs columnist and longtime health care analyst, maintains that some sources have a greater presence than others in shaping Canadians’ thinking about health care. Simpson claims that, “Most information that Canadians get on the subject of health care comes from their physicians or other medical persons such as pharmacists, nurses or nurse practitioners.” Not surprisingly, there seems to be a consensus among health care providers and the general populace that physicians and other medical professionals are the most reputable sources of health care information. But with easier access through smartphones and tablets, the Internet is fast becoming an important source of health care information. And this trend is having an impact on patients and their relationship with their health care providers.

Nicklin sheds some light on how Canadians’ greater use of the Internet as a health care information search tool is affecting the time-honored patient-physician relationship. She explains that because of the easy accessibility to a wide array of information online, “Many Canadians go to their physician armed with a lot of information.” However, she points out that they tend to do this “without knowing whether the information is credible or applicable to their clinical situation.” Simpson has noticed a similar trend: “Some Canadians go on to the Internet and pull down information about whatever they think ails them and other information pursuant to what other medical personnel tell them.”

Nicklin is also quick to point out that: “Some members of the public are arriving [at their physicians’ office or other medical institutions] with some preconceived notions of what they’re dealing with, [and this] creates a reality where the public is coming in equipped with what they believe is accurate information or knowledge; [yet] that information may have actually led them down an incorrect path.” Simpson expands upon this by explaining that: “Some physicians and medical practitioners are cautious about having their patients search the Internet prior to having the physicians themselves conduct the appropriate medical tests and, therefore, make a traditional medical diagnosis.” This is because it can be difficult for the average Internet user to understand the complicated medical information on some websites and to differentiate applicable information from information that is not.

In that light, like Simpson, Nicklin is adamant that it is the physician or health care practitioner who should decide if the information the patient brings is indeed relevant to their situation, and what the diagnosis is.

Information alone can be as misleading as it is helpful, and it can have dangerous consequences in the medical profession, as Nicklin and Simpson point out. Nicklin explains that: “Misinformation can result in delay of treatment and can lead to unnecessary fear and anxiety in a patient.” Simpson says that physicians are aware of these dangers: “They are more aware of the fact that there is all this information out there about medical problems, diseases and remedies, and are aware that patients can be more demanding than in the past.” But patients’ demands are not always properly aligned with appropriate treatment. Increasingly, physicians are noting that an Internet-based self-diagnosis leads to a firm belief (on the part of the patient) that the illness was caused by a certain condition, while the physician knows from experience and rigorous training that this is not so. Simpson’s solution is for physicians to conduct a proper, traditional, “comprehensive” diagnosis; then (and only then) should the patient go to the Internet to learn more about the results and possible treatment options that they would like to pursue with their health care provider.

At the end of the day, Nicklin captures the essence of the challenge associated with the unbridled use of the Internet as a source of information for Canadians’ health care decisions: “Human nature leads us to look for additional information to guide us and to clarify our thinking, but it’s important to recognize that the information may not be accurate or relevant.” Nicklin justifiably warns us all that: “Until an appropriate assessment is done by a health care practitioner, the information being sought from other sources may be as misleading as it is helpful.”


Top Photo: Projects Abroad

Gatsby, the Roaring Twenties and the Road Ahead

May 30, 2013 10:51 am
Gatsby, the Roaring Twenties and the Road Ahead

Baz Luhrmann’s interpretation of The Great Gatsby opened the 2013 Cannes Film Festival, returning F. Scott Fitzgerald’s novel, and the era in which it is set, to the collective consciousness. The public’s interest has been heightened by the film’s pioneering a new frontier in movie marketing, with venerable retailers like Brooks Brothers and Tiffany & Co. selling reproductions of Jay Gatsby’s wardrobe. As Luhrmann recently pointed out, more copies of Fitzgerald’s novel have been sold worldwide over the past two weeks than were sold between its initial publication in 1925 and Fitzgerald’s death in 1940.

The Great Gatsby is set in the summer of 1922 and alternates between the sprawling Prohibition era metropolis of New York City and suburban Long Island’s so-called Gold Coast, the summer playground of the day’s old and new money. The story functions as Fitzgerald’s commentary on the darker side of the social constructs of American society in the aftermath of the First World War — albeit cloaked in a narrative about the mysterious Jay Gatsby, a character often seen as embodying a combination of naïveté, entrepreneurial spirit and the constant quest to improve one’s self. But the broader moment in history in which the story is set is of equal importance to the narrative itself.

Jay Gatsby – that iconic character of American literature – is subtly played by Leonardo DiCaprio, who reveals the inner insecurities of a man who built his image, persona and wealth in the shadows of American society through bootlegging in an era of Prohibition and through his ties to America’s then-burgeoning world of organized crime. The lavish parties that he throws attract respected and notorious members of New York City society in the hope of creating business opportunities to launder the proceeds of crime. While the jury is still out on whether Gatsby is an organized crime kingpin or merely the front man for a chapter of the powerful criminal underworld, it cannot be denied that he spares no expense in trying to fit into the upper echelon of American society and win back the love of Daisy Buchanan, the archetypal embodiment of a wealthy 1920s trophy wife whom he had met five years earlier.

Despite Gatsby’s efforts, on a sweltering summer afternoon inside a lavish suite in New York’s Plaza Hotel, he eventually confronts the reality that all the beautiful shirts, flashy suits, exquisite cars, perfect manners and money in the world will never allow him to enter the elite circles of American society. Membership in that club comes only through bloodline and family heritage – not through newly acquired money. Daisy’s husband, Tom Buchanan, who is flawlessly portrayed by Australian actor Joel Edgerton, is quick to remind Gatsby of this unalterable fact.

Besides the distinction between old and new money in the storyline, Luhrmann’s film masterfully creates a contrast between wealth and poverty through its carefully developed settings, which include the brightly lit, palatial Long Island estates and the dark industrial wasteland that is sandwiched between New York City and Long Island. It is that gloomy “valley of ashes” situated on the road tying the city to the island which is home to many of those who provide the coal energy needed to power the city and its transportation infrastructure. The slag heaps found there are the dirty underbelly of glittering wealth.

Slowly, the mystery of Gatsby’s identity is revealed. He is really James Gatz from North Dakota who grew up penniless and ran away from home to build – through a combination of luck, ingenuity and personal rigor – the wealth needed to support his ambitions and unflagging optimism in his own future. However, unlike the novel and its 1974 film adaptation, Luhrmann’s film does not provide the viewer with a more intimate picture of James Gatz’s transformation into Jay Gatsby, since Luhrmann does not include Jay’s father, Henry C. Gatz, in his film. In the novel and 1974 film adaptation with Robert Redford as Jay Gatsby, Henry sheds light on the humble origins of the mysterious and larger-than-life inhabitant of West Egg, Long Island. Reflecting the mood of America in the 1920s, Henry explains in detail how he knew that his son’s determination and optimism would lead to a successful life. But, more importantly, even though Gatsby was living under a fabricated identity, he remained linked to his father by way of correspondence and funding. The link between father and son, past and present, authentic and illusory is less evident in Luhrmann’s film.

Luhrmann also takes some poetic license in how the narrator, Nick Carraway (played by Tobey Maguire), tells the story of his neighbor Jay Gatsby and the turbulent summer of 1922. Carraway writes about it in a “patient’s journal” as psychiatric therapy for morbid alcoholism, insomnia, depression and other ailments in late 1929 and early 1930 — a time when the recent collapse of the American stock market signaled the end of the decade’s roaring economy. The party was over. Yet, it was during the Roaring Twenties that the benefits of the free market economy reached a wider cross-section of the American public than ever before — far surpassing the “Gilded Age” of American free-market capitalism at the close of the 19th century. Republican president Calvin Coolidge summed up the thinking of the era in a famous (yet often misquoted) speech to the Society of American Newspaper Editors on January 17, 1925. Coolidge said that: “After all, the chief business of the American people is business. They are profoundly concerned with producing, buying, selling, investing and prospering in the world. I am strongly of the opinion that the great majority of people will always find these are moving impulses of our life.” Gatsby would have been a firm believer in this mantra.

A Twenties traffic jam (Photo: Model T Ford Club of America)

What then was the context in which Fitzgerald’s novel was written? During the 1920s, the American economy grew rapidly through increases in the productivity of its workforce, the consequential increase in products produced for consumption, the greater access to credit which allowed the purchase of said products by a broader section of the public, the development of modern consumer product advertising as we know it, and the accelerated use of technology for communication, business, mobility and entertainment. But there was another practical reason why America experienced such growth. While much of Europe faced the arduous task of rebuilding in the wake of the First World War, the fact that America had remained untouched by the carnage and economic destruction of The Great War no doubt played a substantial role in allowing its roaring economy to surpass the crippled economies of the Old World countries.

During that era, the volume of manufactured consumer goods increased by some 64% and the productivity output per individual worker increased by more than 40%. These gains in productivity and volume reduced the price of a vast majority of consumer products including, most notably, that of the automobile. When the automobile became affordable for average Americans, the seeds for America’s car culture were sown and the ground was laid for the major population shifts associated with expanded suburbanization. In 1920, two years after the end of the First World War, there were 10 million automobiles in America. Nine years later, there were more than 26 million automobiles on America’s roads — although few were as magnificent as Jay Gatsby’s yellow Rolls-Royce phaeton (a Duesenberg SJ by poetic license in Luhrmann’s film).

With the expansion in automobiles came an expansion in infrastructure. Roadbuilding boomed and the increased mobility offered by the automobile allowed residential construction to double. No longer was there a need to live near one’s workplace. A 20% increase in wages calculated in terms of purchasing power (or “real wages”) coupled with the increases in productivity and growth helped facilitate construction and automotive booms as well as those of many ancillary industries ranging from gasoline and coal to lumber, rubber, steel and concrete.

But it wasn’t just goods and services which experienced greater sales, profit and growth during the Roaring Twenties. Business in the financial sector was expanding at an unprecedented rate. Wall Street was enjoying an unparalleled bull market. Stocks were trading at record levels. Insider trading was still legal (under most circumstances) and credit was being extended to a larger segment of the population, thereby fuelling growth and consumption. Furthermore, the proliferation of communication technology (most notably the telephone) was raising the stakes of business. The information needed for trading could be transferred faster than ever before. The telephone would remain the primary tool for gathering information and monitoring the performance of markets until the Internet came onto the business scene some 70 years later.

However, trouble was just around the corner. As would be the case in 2008, access to large volumes of easy credit led many to make frivolous purchases, to speculate excessively on investments and to delay payment for their purchases. This behavior led to personal debt levels eclipsing personal income levels. In the fall of 1929, the stock market collapsed. By the spring of 1930, the United States of America had plunged into the most severe depression that the modern world had ever seen. Seven years after the lights went out for the last time at Jay Gatsby’s Long Island mansion at the end of the summer of 1922, America’s economy would descend into darkness only to rise again because of the industrial demand required to win the Second World War – a stellar example of capitalism’s creative destruction.

Top Photo: Warner Bros. Pictures

What’s Inside the Coast Guard’s Toolbox?

May 29, 2013 12:30 pm
What’s Inside the Coast Guard’s Toolbox?

There are many sides to the Canadian Coast Guard. To start with, there is a fleet of approximately 118 vessels, ranging from icebreakers to SAR lifeboats, as well as almost two dozen helicopters which are used on a daily basis for many services and operations — perhaps the best-known of which is airborne naval Search and Rescue.

The aquatic tools of the trade used by the Coast Guard come in all shapes and sizes. At one end of the spectrum are vessels like the CCGS Louis S. St-Laurent, the 119-metre (393-foot) heavy Arctic icebreaker which is widely viewed as the Canadian Coast Guard’s flagship given its size, its large helicopter hangar and landing pad, its sophisticated science laboratories and its five powerful engines capable of churning out a total of close to 40,000 horsepower. At the other end of the size spectrum are the four dozen SAR lifeboats, rugged 40- to 60-foot powerboats which are often the first responders to any distress call on Canada’s oceans or the Great Lakes. Given the extremely rough and treacherous seas on which they must operate while conducting rescue missions, each SAR lifeboat is specially designed to withstand the very worst that Mother Nature can come up with. Each is constructed to be what the naval industry refers to as a “self-righting” vessel: when hit by a wave strong enough to flip it upside down, the boat does not remain capsized but rolls over until it is right side up again, and is therefore able to continue carrying out its rescue operation despite the turbulent water.

Between the two size extremes can be found medium icebreakers, high endurance multi-task vessels, trawlers and an array of specialty vehicles ranging from oceanographic and other science-specific vessels to off-shore patrol vessels. However, one of the most unusual pieces of equipment in the Coast Guard’s toolbox is neither a ship nor a boat — at least in the traditional sense — but rather a hovercraft.

The Coast Guard currently has four hovercraft in its fleet, each of which is used daily in one capacity or another. Michel Vermette, Deputy Commissioner for Vessel Procurement in the Canadian Coast Guard, said that the hovercraft are anything but novelty vessels: they provide very real services. He explained that, although they sit upon an air cushion instead of a heavy steel hull, “In the St. Lawrence River, or other regions that have thin ice, they are effectively used for breaking ice.” The hovercraft serve as efficient and effective thin-ice icebreakers by “riding on top of the ice and using the hovercraft’s sheer weight to cause the ice to collapse.” Vermette also noted that their very shallow draft allows the hovercraft to “do a lot of buoy maintenance due to the fact that they can get in and out of shallow areas easily.”


Michel Vermette, Deputy Commissioner for Vessel Procurement, Canadian Coast Guard

Time and tide wait for no one and the same can be said for machines. As Vermette says, “Many of our ships are over 30 years old.” While the Coast Guard may have taken delivery of new SAR lifeboats over the past decade and a half, the Deputy Commissioner is quick to point out that the larger vessels currently serving in the Canadian Coast Guard’s fleet are anything but new additions. As he puts it, “The reality is that we haven’t put into service a large vessel in more than a generation. The last new, purpose-built large vessel [that we acquired] was the CCGS Henry Larsen, which was brought into service in 1988.”

But the times are changing. Under the 2011 National Shipbuilding Procurement Strategy (NSPS) and with the 2012 Federal Budget, $5.2 billion will be allocated for the design and construction of new ships for the Canadian Coast Guard and the Royal Canadian Navy. Vancouver Shipyards Co. Ltd. has won the NSPS “Non-Combat” (Coast Guard) portion of the shipbuilding contracts, a contract that includes the construction of a new state-of-the-art heavy polar icebreaker named the CCGS John G. Diefenbaker to replace the aforementioned CCGS Louis S. St-Laurent, as well as one off-shore oceanographic science vessel and three offshore fisheries science vessels for the Coast Guard. John Shaw, Vice-President of Government Relations and Business Development at Seaspan Shipyards, the parent company of Vancouver Shipyards Co. Ltd., which is tasked with building the new Coast Guard vessels, explains the process.

The first vessels to be built will be the three off-shore fisheries vessels.  “We’re still going through the design process for those initial vessels” but once the design stage is complete, “We will approach the building process by staggering the start times for each vessel but, [as things get rolling,] we will actually build them simultaneously.” Getting to that stage isn’t easy. Shaw pointed out that “Building the first ship will be a little bit slower than the subsequent ships because we need to bring our new ship-building facilities up to speed to [facilitate] the construction [of all of the ships on order].” Yet, he was optimistic that “of course, once we have started production of any of these vessels, the timelines for construction will accelerate.”

Many of the Coast Guard’s nearly two dozen helicopters are not exactly spring chickens either. In fact, Canada operates the oldest fleet of MBB 105 light helicopters in the world. According to Deputy Commissioner Michel Vermette, the Coast Guard is in the process of “requesting to bid on replacing many [of our helicopters] with up to 24 new light and medium helicopters.” He noted that, in theory, replacing the Coast Guard’s helicopters with newer products is a less complicated task compared with constructing purpose-built ships which are designed and built from the ground up in specialized facilities. As Vermette puts it, “in this case we’re buying commercial off-the-shelf equipment.” It remains to be seen if that process will be as straightforward as it is hoped.


Here’s the Coast Guard at a glance.  

The Coast Guard:

• Provides resources and support for maritime-based Search and Rescue (SAR) operations which results in some 2,000 lives being saved annually;

• Manages innumerable navigational aids which help vessels stay on course;

• Monitors naval traffic;

• Provides essential information to mariners;

• Conducts aquatic scientific research;

• Responds to naval-based environmental emergencies; and, of course,

• Provides ice-breaking services

From Coast-to-Coast-to-Coast

April 9, 2013 12:04 pm
From Coast-to-Coast-to-Coast

Canada is a maritime country which generates a substantial amount of its GDP from exporting natural resources in today’s increasingly interconnected global economy. Its waterways serve as the maritime transportation network that contributes to the health, vitality and prosperity of the Canadian economy. Without safely navigable oceans and inland waterways, Canada’s position on the world stage would be compromised. And since Canada is home to one of the longest coastlines of any country in the world, the stakes could not be higher.

The responsibility of keeping Canada’s waterways navigable, as well as ensuring the safety and well-being of the crews and vessels that use those waterways, rests with the Canadian Coast Guard (CCG), which celebrated its 50th anniversary in 2012. The birth of the modern Canadian Coast Guard occurred on January 26, 1962, when Prime Minister John Diefenbaker established the CCG and outlined its three main responsibilities: “First, offering dedicated search-and-rescue services; second, maintaining Arctic sovereignty; third, responding to technical advances and increasing vessel traffic.” Today, 51 years later, that mandate remains unchanged.

The CCG’s roots extend back to the late 18th and early 19th centuries. The waterways in the Dominion of Canada were extremely treacherous. There were no safety standards to which ships had to be built and navigational aids were rudimentary at best, if not absent altogether. By the time the British North America Act of 1867 (now the Constitution Act, 1867) was ratified and the Dominion of Canada came into being, the waterways in the new Confederation were becoming more crowded than ever before. At the same time, the pace of shipbuilding worldwide was accelerating and many recently constructed commercial vessels were using relatively new (and substantially faster) steam-powered propulsion, as opposed to the long-standing naval reliance on the wind. With these developments, the number of passengers traveling by sea was increasing annually. The combination of these factors meant that the risks of — and consequences from — marine disasters were on the rise.

In 1867, amidst this unregulated and fast-paced new marine environment, the federal government inherited a number of what the modern-day CCG calls “elements of marine infrastructure – navigational aid systems; life-saving stations; canals and waterways; regulating organizations and enforcement vessels; and supporting shore infrastructure.” These elements of infrastructure were under the purview of the Department of Marine and Fisheries, established in 1868. In 1930, the Department of Marine and Fisheries was broken up into two distinct ministries. In 1936, marine-based responsibilities and infrastructure came under the control of the Department of Transport (DOT) — a move that brought with it the now familiar CCG duties of icebreaking and the maintenance of marine navigational aids.

The Canadian Coast Guard is the principal civilian maritime operational arm of the Government of Canada. Its activities and services are varied and complex, providing resources and support to the maritime element of Search and Rescue (SAR) operations and saving some 2,000 lives every year. It manages innumerable navigational aids to help vessels stay on course and is instrumental in the monitoring of naval traffic. It provides essential information to mariners and is involved in aquatic scientific research. It responds to naval-based environmental emergencies like oil spills. The CCG also provides ice-breaking services, ensuring that Canadian shipping channels are ice-free.

These activities continue to be carried out on a daily basis in the three separate CCG regions of operation which were consolidated in October from the five created shortly after the inception of the CCG. The previous long-standing five regions of operation (the Pacific, Central & Arctic, Quebec, the Maritimes, and Newfoundland and Labrador) were streamlined into three larger zones of operation known as the Western region, the Central and Arctic region, and the Atlantic region.

Jody Thomas is Deputy Commissioner of Operations for the Canadian Coast Guard.

Jody Thomas, Deputy Commissioner of Operations for the CCG, said this decision was made “as part of the Canadian Coast Guard’s strategic and operating review process.” Thomas explained that simultaneously reducing the number of operating zones while expanding their size would allow the CCG to function more efficiently and effectively, thereby “reducing costs to Canadians without reducing services to Canadians.”

When the CCG was founded in 1962, it commissioned over 40 vessels ranging from search-and-rescue cutters to large icebreakers. Since then, its fleet has expanded to 119 vessels and 22 operational helicopters. It also has a staff of some 4,500 employees, about half of whom work on land. Like their sea-based colleagues, the land-based CCG employees often perform essential services that are required to help the CCG carry out its mandate of promoting accessibility to, and safety on, our nation’s waterways. These services include maintaining marine communications and navigation systems, developing electronic naval technology, and providing engineering services.

Whether operating on land or sea, the CCG comes under the umbrella of the Department of Fisheries and Oceans (DFO) where it provides material and human resources for the Search and Rescue Program (SAR) in matters involving federal jurisdiction and responsibility. This is perhaps the best-known function of the CCG.

Stay tuned to our series on the Canadian Coast Guard. The next issue will pick up where this brief historical overview leaves off, with a more detailed discussion and analysis of the essential services provided by the CCG and a behind-the-scenes look at the vessels, vehicles and technology that allow the CCG to deliver those services. There will be a few surprises that will appeal to the seasoned naval historian and the casual reader alike.

Canadian Coast Guard Facts

To the Rescue

In early June 2010, the CCG contributed to the Gulf of Mexico oil spill relief efforts by sending 3,000 metres of off-shore boom to Louisiana. Several Coast Guard regions joined forces to make this offer of assistance possible, all while maintaining a reasonable response capacity in Canada.

 Best Coast Guard College in the World

The Canadian Coast Guard College (CCGC) has a stellar reputation as one of the best Coast Guard colleges in the world. It has provided a free, world-class marine education to thousands of Canadians since its creation in 1964. Located in Sydney, Nova Scotia, the CCGC is the place to go for aspiring ship’s officers. Upon completing their studies, graduates are guaranteed a job on a CCG vessel.

Keeping Canada Safe

The Marine Communications and Traffic Services (MCTS) of the CCG are a year round, 24/7 operation. Traffic Services personnel watch the waters and listen to radio transmissions to keep Canadian waters safe, secure and navigable.

In the Line of Fire: Long-Term Disability Insurance Coverage

11:58 am
In the Line of Fire: Long-Term Disability Insurance Coverage

Few things in life are guaranteed, including long-term disability (LTD) insurance plans. While long-term disability protection may be offered through an individual’s employer, not all plans are created equal.

Basically, there are two types of LTD. The first is the Insured Benefit LTD, where the plan sponsor (the employer) pays a premium to an insurance company which then covers employees. Thus, it is the insurance company itself, and not the employer, that assumes the financial risks involved. In this capacity, insurers are required to set up reserves against future payments. In other words, when an individual goes on long-term disability, the insurer has to set aside enough money up front to cover the expected payments for that individual. So, even if the employer sponsoring the plan goes bankrupt, the coverage and benefits that the employee receives will continue. In addition, insurers in Canada are subject to a stringent regulatory regime requiring that reserves be held separate from the insurer’s general funds and that insurers hold an additional capital cushion over and above their other liabilities. At the end of the day, employees on long-term disability with fully-insured benefits can be assured that their LTD payments will continue for however long they remain disabled and unable to work.
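To make the idea of a reserve concrete, here is a simplified present-value sketch: the insurer sets aside roughly the discounted sum of the benefit payments it expects to make. The benefit amount, duration and discount rate below are invented for illustration, and real actuarial reserving also weights each payment by the probability that the claim is still in force.

```python
def ltd_reserve(monthly_benefit: float,
                months_expected: int,
                annual_discount_rate: float = 0.03) -> float:
    """Simplified reserve: the present value of expected future LTD payments.
    Real reserves also reflect recovery and mortality probabilities, omitted here."""
    monthly_rate = (1 + annual_discount_rate) ** (1 / 12) - 1
    return sum(monthly_benefit / (1 + monthly_rate) ** m
               for m in range(1, months_expected + 1))

# A hypothetical $2,500-per-month benefit expected to run ten years (120 months)
# implies a reserve of roughly $260,000 held today, regardless of what later
# happens to the employer sponsoring the plan.
print(round(ltd_reserve(2500, 120)))
```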

The second type of LTD insurance plan is the Uninsured Benefit Plan or Administrative Service Only (ASO) plan. With an ASO, it is the employer who pays all the benefits to employees and a third-party administrator simply helps to manage the plan. Under such an arrangement, the third-party company focuses primarily on examining claims and administering payments on behalf of the plan sponsor. More often than not, uninsured LTD plans function as a “pay-as-you-go” plan. The Canadian Life and Health Insurance Association (CLHIA), which represents Canada’s life and health insurers, notes that these “pay-as-you-go” plans “rely on the plan sponsor being able to continue to generate adequate cash flow each year over the lifetime of the plan and to pay benefits for the duration of the benefit period.”

With an uninsured LTD, the plan sponsor is not required to establish a reserve fund against long-term disability payments. In that case, should the employer enter bankruptcy, there likely would not be adequate funds available to cover the future long-term disability payments of employees. It is not unknown for employees to face a complete loss of coverage in such circumstances.

While the vast majority of employers who offer disability benefits do so on an insured basis, those who do it on an ASO basis are generally large employers.

CLHIA President Frank Swedlove explains that although “situations [where employers go bankrupt] don’t arise that often, when they do, they tend to arise with large national companies and affect a lot of people.” One high-profile instance was the evaporation of the LTD benefits for former Eaton’s employees when the iconic Canadian retailer went bankrupt in 1999. Another that hit the Ottawa region hard was the recent bankruptcy of Nortel.

Recognizing these challenges, Canada’s insurance industry has examined many proposals to ensure the continued delivery of LTD benefits in the event of a plan sponsor’s financial collapse. As Swedlove notes, “the industry believes that the most effective solution is to ensure that all long-term disability plans in Canada are offered on a fully insured basis only.” The federal government has taken note of the risks associated with uninsured LTD plans in light of Nortel’s bankruptcy. The Jobs, Growth and Long-Term Prosperity Act has been used to amend the Canada Labour Code, requiring federally-regulated private-sector employers who provide LTD insurance benefits to use insured rather than uninsured plans. Swedlove believes this is a good starting point, but that more still needs to be done to cover all Canadians because “the changes to the Canada Labour Code only affect companies that are under federal jurisdiction, but the vast majority of companies here in Canada are under provincial jurisdiction.” Bringing all the players to the table to create a better system that protects all employees would ultimately benefit everyone.


The Winds of Change and Canada's Health Care System

11:53 am
The Winds of Change and Canada's Health Care System

The 2011 Census of Canada revealed that continued immigration is changing the face of our country. Not only is Canadian society increasingly multicultural, we are becoming one of the fastest-growing populations in the G8.

Wendy Nicklin is the President and CEO of Accreditation Canada, a not-for-profit organization that develops and implements the standards that enable a health-care provider to assess and improve performance. Nicklin says that, “Canada’s changing demographics have been a reality for more than the past few decades, but it has been escalating over the past decade or so.”

A growing Canadian population driven by large numbers of new Canadians with diverse ethnic and cultural backgrounds has important implications for the delivery of health-care services, as well as for the health-care system itself. In particular, it requires Canadian physicians and other medical practitioners to become more conscious of the cultural makeup of the patients they treat. Nicklin added that, “understanding different cultural values is very important when it comes to providing quality care for Canadians.”

A September 1996 article in the Canadian Medical Association Journal, written by Janice Hamilton (Multicultural Health Care Requires Adjustments by Doctors and Patients), affirms what Nicklin is talking about: “Efforts to provide culturally-appropriate health care are being made in hospitals, clinics and physicians’ offices across the country.”

More recently, Nicklin noted that health-care organizations “regularly scan [for] the characteristics of the population they serve in an attempt to improve the services themselves and also the delivery.” However, almost 17 years after Hamilton’s article was published, Nicklin maintains that more remains to be done when it comes to building awareness about the diverse cultural makeup of patients in order to deliver more efficient, effective and affordable care.

One of the key areas that can be improved upon is communication between health-care practitioners and patients. It is an important step in creating an improved health-care system for all Canadians, regardless of their cultural background. However, it is not always easy to achieve. This is especially true if there is a language barrier between the health-care provider and the patient.

As Hamilton pointed out, “cross-cultural communication can be difficult for both doctor and patient.” She stressed that “…if the patient understands and agrees with the treatment, and the physician understands the patient’s views and ensures that the treatment is appropriate, the outcome will be better.” Of course, when a growing percentage of patients have neither English nor French as their mother tongue, this mutual understanding may not occur. Not surprisingly, according to Hamilton, “sometimes language difficulties cause misunderstandings or misdiagnosis.” Today, according to Nicklin, Canada’s health-care practitioners need to better understand that “the languages that your health-care workers speak are extremely important if you want to have a health-care system that is effective and responsive to Canada’s evolving population.” In other words, to communicate better with patients, the system needs a greater number of health-care providers who can speak more than just one or the other of Canada’s official languages.

Yet it is also important to remember that cultures vary not only in terms of language but also in terms of perception and attitude about the role of the health-care system and about the doctors and hospitals that function as the system’s visible representation. Years ago, Hamilton shed light on two of the most common types of cultural variations that can be a factor in hampering the effective delivery of health care to all Canadians in an increasingly multicultural Canada. First, “there is a cultural variation in attitudes towards physicians.” For instance, individuals of certain backgrounds may disagree with, or refuse to accept, a diagnosis delivered using the Canadian medical system’s standard, empirical evidence-based, medical procedures and testing. Instead they may insist upon the delivery of a diagnosis — and then eventually a treatment — which uses the diagnostic methods familiar to them from their country of origin.

The second cultural perception issue relates to hospitals. As Hamilton noted, “the hospital may also have a different role in other countries.” For example, “some parents become frightened when a doctor wants to admit a child to the [hospital] for tests or observation.” It often turns out that, in the country of origin, hospitalization means the patient is at death’s door.

Wendy Nicklin

Nicklin, for her part, stresses how important it is for the Canadian health-care system and its practitioners to recognize that “certain diseases or ailments have more predominance in different parts of the world.” What this means is that our health-care system must conduct the research and allocate the resources necessary to treat those diseases, ailments or disorders which may now be more common in the changing demographic of our country.

Essentially, as Nicklin explains, medical practitioners and researchers have to realize that “there are many different determinants of someone’s health and they’re important to focus on.” Income, social status, education, literacy, social environment, childhood development, gender and nutrition are just a few of the determinants affecting an individual’s health and, therefore, both the volume and cost of the care required on a lifelong basis. As Canada looks to improve its health-care system, increasing cultural sensitivity and communication will only increase efficiencies.

Matthew Stapley: An Insider’s Perspective on the Psychic Profession

March 11, 2013 11:19 am
Matthew Stapley: An Insider’s Perspective on the Psychic Profession

Eighteenth-century French philosopher Julien Offray de La Mettrie claimed that “everyone is born with psychic abilities; it’s just a matter of learning how to access them.” Ottawa-based psychic-medium Matthew Stapley would tend to agree. In an exclusive interview with Ottawa Life Magazine, Stapley was quick to point out that the psychic’s ability to perceive is not limited to psychics alone; it can be learned by those from all backgrounds, whether spiritual or not.

As Stapley puts it: “All you really have to do is figure out how your body, mind and spirit work together. And then, if you can do that, you can perceive things.” Doubtless, that is easier said than done, but Stapley sheds light on the inner workings of the psychic profession and deals with some misconceptions that surround it.

Matthew Stapley is the director of TransdimensionalBeing Associates Inc., an organization that conducts psychic readings and healing services in private, group or event settings. He is also the host of Rogers Television’s unique new weekly show, Psychic Insights with Matthew Stapley. The program’s format combines interviews with psychic practitioners and real-time viewer interaction. Viewers are able to call in and ask Stapley and his guests questions about the unknown.

Stapley first became aware of his psychic abilities at a very young age when his father, who was traveling in Thailand at the time, telephoned his family in Carleton Place. Eight year-old Matthew was able to describe in detail the room in which his father was sitting some 13,000 kilometers away. Stapley explained that his accurate, sight unseen, description of his father’s lodging on the other side of the world is an example of a psychic skill that the profession calls “remote viewing” or “the ability to see and describe things and places when you’re not immediately present.”

Stapley’s line of work is that of a psychic-medium — but what exactly does that mean? According to Stapley, psychic-mediums “perceive things about people, about the spirit energies that are around them.

“Most of what I do is spirit communication,” he says. Since any discussion of “spirit energies” can quickly go beyond esoteric for those unfamiliar with the psychic profession, Stapley explains that, when he and his fellow psychic-mediums refer to “spirit” and “spirit energies,” what they mean is that “spirit can be either a specific spirit that the person [with whom the psychic-medium is conducting a reading] knew or, alternatively, spirit can be the energy of the spirit world in general around them.”

Expanding upon this guide to the work of a psychic-medium, Stapley characterizes his work, and that of others like himself, as a “modality of healing.” As he sees it, psychic-mediums allow people to obtain closure, particularly since there are few places where people can find closure with someone who has passed away. It is the healing modality which “provides a place in people’s lives where they can communicate with the departed again.”

Stapley stresses the importance of understanding the different types of psychic practitioners (and the services they offer) to ensure that the needs of potential clients are met. In addition to psychic-mediums like Stapley, there are two other types of psychic practitioners, each sharing a common yet distinct skill set. The first are spiritual guides. Like psychic-mediums, spiritual guides tap into the spirit energy of their client. However, as Stapley explains, unlike psychic-mediums, spiritual guides channel the spirits in the client’s aura to help guide the client through life. Stapley explains that spiritual guides “guide people on their path” and act as an “advisor” to the client. The other type of psychic practitioner is called a healer. Healers provide “energetic healing facilitation.” That is, healers “hold space for a spirit to come in and do the work it needs to do” to help mend a client’s physical, neurological or psychological ailments, common examples of which may be arthritis or depression.

Beyond these three categories of psychic practitioners, other psychic abilities may be used, depending on the type of practitioner a psychic might be. Stapley notes that, within the psychic profession, these abilities are referred to as the “clair gifts,” which include “clairvoyance (clear seeing); clairaudience (clear hearing); clairsentience (clear feeling); and claircognizance (clear knowing).” These are the psychic abilities that enable the psychic practitioner to provide a service, which often entails reading their clients’ future. Stapley considers claircognizance to be the most challenging of the clair gifts because of the baggage it places on the psychic who possesses it. For instance, a psychic possessing claircognizance “could just meet you and automatically know things about you.” Stapley notes that, for psychics with claircognizance, “instead of perceiving it [information about their client or anyone else they come across], they just know it.” Such a psychic can often be overwhelmed by a constant bombardment of information about strangers they encounter, whether clients or not. Consequently, Stapley is quite satisfied to possess the two gifts of clairvoyance and clairaudience.

Many people automatically associate a psychic with the reading of tarot cards, but Stapley routinely conducts remote viewing and spirit communication without their assistance. “Cards are really good. They’re helpful to confirm things but, when I actually do readings with cards, I find it limiting because you’re stuck with what is on the cards.” He prefers a cardless approach: “I feel like, when I don’t use cards, I just listen to what I’m hearing and seeing in somebody’s aura. I find that to be a lot more accurate as well.” And when it comes to ethics, Stapley understands the importance of being honest with clients. “If a reading is not going where I think it should be, then I will hand it off to another person who is better suited to help my client.”

In the mind of the general public, many enduring misconceptions surround the psychic profession, and all types of psychics — not just psychic-mediums like Stapley — must contend with them. The most common misconceptions originate with skeptics, who see the psychic profession as contrived and insincere, if not outright dishonest. Should skeptics participate in a reading, they may do so with their minds closed, expecting the psychic to be wrong about just about everything. Stapley has had to deal with skeptics seeking to “test” the psychic-medium on numerous occasions and explains that “I’ve had skeptics want to test me, but I always say ‘no’ because I believe there’s nothing objective they can test with. If they want to find something wrong in what I say, they’ll find it, and if they don’t, they won’t.”

Those who do not question the legitimacy of the psychic profession also play a role in perpetuating misconceptions. Stapley explains that “People who believe in this expect us to be right about everything one hundred per cent of the time — which is impossible. Even in a reading about things like the person’s past, we can be totally wrong. And that’s just part of the job.” This human element of the psychic profession often escapes notice because the media rarely reveals the shortcomings of high-profile psychics, which bothers Stapley, who insists that “the media needs to show psychics being inaccurate.” He claims that doing so is “extremely important, especially from an outsider perspective.

“There are times when psychic mediums are bang on in their readings, but there are also times when they’re not. I think that to help both sides — the skeptical and the non-skeptical — the media need[s] to show both sides.” To do so would help contextualize the reality as well as the limits of the psychic profession.

Another common misconception is that law enforcement agencies often use psychic-mediums to help solve criminal cases. Stapley claims that while this practice is more common in the United States, it does not often occur in Canada since “it’s hard to use in court” if the investigation to which a psychic-medium contributed eventually goes to trial. That said, Stapley admits that “I’ve worked on a few cases helping to find animals and people but never for the police, only for the families involved.” He adds that, in Canada, if a psychic-medium is involved in an investigation, criminal or not, “most often the families involved will hire a psychic-medium on their own, but the police don’t tend to.”

Aside from the misconceptions perpetuated by skeptics and believers alike, Stapley notes that, in the eyes of many, a stigma of illegitimacy remains attached to the psychic profession. But while media coverage of psychics may well perpetuate the misconception among believers that psychic-mediums are never inaccurate, Stapley notes that the media is also playing a role in lessening that stigma: with more television and radio programs about the psychic profession on the air today, the profession is becoming more accepted. Nonetheless, while “prior to the last ten to fifteen years, psychics were more marginalized as professionals and people would come to see their psychics in secret,” even today “psychic-mediums are still often the last stop after the psychiatrist’s office.” Yet this need not be the case because, like the 18th-century philosopher, Stapley is convinced that “All people, and not just exclusively those in the psychic profession, have the ability to perceive things. All you really have to do is figure out how your body, mind and spirit work together.”


Psychic Insights with Matthew Stapley airs on Rogers Television Cable 22, Mondays at 4 p.m.




America’s Fiscal Albatross

February 11, 2013 1:35 pm
America’s Fiscal Albatross

The inaugural celebrations are over and a new Congress has been sworn in. Washington, D.C. is returning to business as usual, with the status quo largely upheld. The White House and the Senate remain in the hands of the Democrats and the House of Representatives in the hands of the Republicans — and the 21st-century challenges facing lawmakers remain unaddressed.

Perhaps the most significant of these challenges is finding a way to accelerate America’s anemic economic recovery while shoring up its fiscal house to avoid a decline from world superpower to the world’s best-funded banana republic. Politics in Washington remain polarized. The president’s second inaugural address has been viewed by many liberals as a new manifesto for progressive liberalism in the 21st century. Conversely, conservatives see it as a divisive and combative speech that steers America sharply to the left by proposing policy positions that would drastically increase the size and scope of government, culminating in a more European America. Not to be overlooked is the fact that there is no guarantee America’s anticipated economic recovery will occur unless its fiscal challenges are addressed in a bipartisan manner.

Reality once again deflated presidential rhetoric when the American economy took another unexpected turn for the worse. Recently released data revealed that, in the fourth quarter of 2012, the economy contracted at an annualized rate of 0.1% which, according to the United States Commerce Department, is the worst economic report card since the onset of the 2008-2009 financial crisis and the first quarterly contraction since mid-2009. To make matters worse, the U.S. Labor Department’s January jobs report revealed that the American economy created only 157,000 jobs, compared to the nearly 200,000 created in December 2012.

The U.S. Capitol Building (Credit: Georgia Tech)

While it may be true that the decrease reflected the end of the cyclical (and temporary) holiday retail hiring season, winter storms that curtailed consumer spending, lower levels of inventory investment and the elimination of some 9,000 government jobs, the net result has been an increase in the U.S. unemployment rate from 7.8% to 7.9%. This is bad news for any president embarking upon a second term, especially one whose second-term agenda (at least according to his inaugural address) appears to focus on the pursuit of contentious “culture war” issues rather than on reinvigorating the American economy.

But with more than $16 trillion in debt, continuing high unemployment and the probability that the Obama administration will impose higher tax rates, there is an increasing likelihood that the United States will become less appealing to business investors and entrepreneurs. Furthermore, research undertaken by the United States Government Accountability Office (GAO) and the Congressional Budget Office (CBO) — nonpartisan agencies that investigate how the federal government spends taxpayer dollars — has found that the best-case scenario for the American economy is to maintain (not reduce) today’s debt-to-GDP ratio of 72% over the course of the next decade. Looking beyond 2022, however, America’s albatross of federal debt will become increasingly unsustainable and a much greater threat because of rising spending on existing entitlement programs — most notably Medicare and Medicaid — driven by aging demographics and declining birth rates. Optimistic scenarios forecast America’s debt-to-GDP ratio at roughly 160% by mid-century, while pessimistic scenarios put that figure at 200%. To put this into perspective, Greece’s debt-to-GDP ratio is now closing in on 200%.
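For context, the headline figure is a simple quotient of debt to national output. The calculation below is a minimal illustrative sketch, not an official statistic: it assumes round 2012-era values of roughly $11.6 trillion in federal debt held by the public (the narrower measure typically used for this ratio, rather than the $16 trillion-plus gross debt cited above) and roughly $16.2 trillion in nominal GDP, chosen only to show how a ratio near 72% arises.

\[
\text{debt-to-GDP ratio} \;=\; \frac{\text{debt held by the public}}{\text{nominal GDP}} \times 100\%
\;\approx\; \frac{\$11.6\text{ trillion}}{\$16.2\text{ trillion}} \times 100\% \;\approx\; 72\%
\]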

The White House (Credit: MSNBC)

While the United States of tomorrow is not necessarily the Greece of today, the objectives outlined by President Obama in his second inaugural address will not reduce the looming fiscal threat. Clearing a path to legal citizenship for the 11 million illegal immigrants in America, further subsidizing the development of experimental (and unprofitable) alternative energy sources, penalizing the job- and wealth-creating traditional energy industries, and limiting the development of the country’s vast shale oil deposits — while advocating higher tax rates for individuals and corporations — will not prevent America’s possible metamorphosis into a bloated reflection of present-day Greece.

The class-warfare line of reasoning for increasing tax rates on what the Occupy Wall Street movement has dubbed the “top 1%” of American income earners will not solve America’s economic woes. Under the existing tax code, this so-called “top 1%” already pays more than 40% of all federal income tax, and increasing their tax rates substantially — as President Obama has vowed to do — would make only a marginal contribution to paying down America’s debt. Neither punitive tax rates nor subsidized alternative energy projects will remove the fiscal albatross from the neck of the American economy. What is required is a realistic discussion followed by an action plan to reduce government entitlement programs and expenditures. However, the American public will likely remain unaware of this reality should President Obama pursue the priorities set out in his second inaugural address. But then again, this should come as no surprise: during his first term, President Obama created his own presidential deficit reduction commission — the Simpson-Bowles National Commission on Fiscal Responsibility and Reform — only to ignore its recommendations. Instead, he added more than $5 trillion to the national debt over four years and finished his first term with the U.S. Senate having gone more than a thousand days without passing a budget.


Top Photo: US Magazine

