With a hull made of carbon fibre and topside surfaces shaped to deflect radar, the Visby is hard to spot electronically. Travelling at less than 22 kilometres an hour (13 in rough seas), the Visby is practically invisible to radar.
The 650-ton ships are armed with a 57 mm gun, eight RBS-15 anti-ship missiles, and anti-submarine torpedoes, mines or depth charges. The crew is small (43), but the ship can move fast (about 70 kilometres an hour) in all kinds of weather. The Visby has radar, sonar and thermal imaging equipment. The ship is 240 feet long and 34 feet wide, with a draft of less than 3 metres.
Propulsion is via waterjets, which makes the ships harder for submarines to detect. The Visby ships can also carry a helicopter, and are equipped with hull-mounted and towed-array sonars for hunting Russian subs off the Swedish coast. Marport C-Tech was a major supplier of sonar equipment to the Visby class corvettes.
Five Visby class ships have been built, and all will be in service within three years.
Many foreign navies have shown significant interest in the Visby technology.
Carbon dioxide emissions from human activities aren’t just warming the planet. Another problem of rising atmospheric carbon dioxide is that CO2 is being absorbed by the oceans, which increases seawater acidity (lowers the seawater pH). This process, termed ‘ocean acidification’, has received growing scientific and public interest because it threatens certain groups of marine organisms, including corals. Only recently have researchers realized that human-made carbon dioxide not only warms and acidifies the ocean — it also affects acoustical properties of seawater, making it more transparent to low-frequency sound.
Oceanographers Tatiana Ilyina and Richard Zeebe of the School of Ocean and Earth Science and Technology at the University of Hawaii write in the journal Nature Geoscience that seawater sound absorption will drop by up to 70% during this century. The scientists have examined the effects of man-made carbon dioxide under business-as-usual emissions and provide projections of the magnitude, time scale, and regional extent of changes in underwater acoustics resulting from ocean acidification.
When carbon dioxide dissolves in seawater, it produces carbonic acid and increases the hydrogen ion concentration (acidity). Seawater pH has declined by about 0.1 units compared to preindustrial levels, corresponding to about a 25% increase in acidity. These changes may appear small, but pH is measured on a logarithmic scale, analogous to the Richter scale used to measure the strength of earthquakes. For example, a drop in pH of one unit implies a ten-fold increase in acidity. Low-frequency sound absorption depends on the concentration of dissolved chemicals such as boric acid, which, in turn, depends on seawater pH. This is why changes in seawater pH affect ocean acoustics.
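Because pH is the negative base-10 logarithm of hydrogen ion concentration, the acidity figures above follow from simple arithmetic. A minimal sketch:

```python
def acidity_ratio(delta_ph: float) -> float:
    """Factor by which hydrogen ion concentration rises when pH drops
    by delta_ph units (pH = -log10 of the hydrogen ion concentration)."""
    return 10 ** delta_ph

# The ~0.1 unit decline since preindustrial times:
print(f"{acidity_ratio(0.1) - 1:.0%} more acidic")   # roughly 26%, i.e. "about 25%"

# A full one-unit drop:
print(f"{acidity_ratio(1.0):.0f}x more hydrogen ions")  # ten-fold
```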
“If we continue to emit carbon dioxide at business-as-usual rates, the pH of surface seawater will drop by 0.6 units by the year 2100. As a result, the absorption of 200 Hz sound would decrease by up to 70%,” says Tatiana Ilyina. For comparison, the middle C of the piano is tuned to 261.6 Hz; in the ocean, sound around this frequency is produced by natural phenomena such as rain, wind, and waves, by marine mammals, and by human activities such as construction, shipping, and the use of sonar systems.
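The quoted figure can be roughly checked from the pH dependence of the boric-acid absorption term. In the widely used Francois-Garrison (1982) formulation (an assumption here for illustration; the study's own model may differ in detail), that term scales as 10 to the power of 0.78 times pH:

```python
def boric_absorption_ratio(delta_ph: float) -> float:
    """Fractional change in the boric-acid sound absorption term for a
    given pH change, using the 10**(0.78 * pH) scaling of the
    Francois-Garrison formulation (assumed here for illustration)."""
    return 10 ** (0.78 * delta_ph)

# Projected surface-water pH drop of 0.6 units by 2100:
remaining = boric_absorption_ratio(-0.6)
print(f"absorption falls to about {remaining:.0%} of its present value")
# a reduction of roughly two-thirds, in line with "up to 70%"
```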
“Most people know that when they turn on the air conditioner or drive a vehicle, they emit carbon dioxide, which causes climate change and ocean acidification. The surprise now is that it also affects sound absorption in the ocean,” says Zeebe. “What is happening over time is that the low frequencies become louder at distance. It’s similar to the effect when you slowly turn up the bass on your stereo.”
However, underwater sound propagation is much more complex; it depends on spatial distribution of sound sources and environmental parameters. Some areas in the ocean will be affected more strongly than others. Areas with large sound absorption reduction and intense noise sources, for example from shipping, could become “acoustic hot spots” in the future. The largest changes are projected to occur in the surface ocean waters in high latitudes, for instance, in the North Pacific and in the Southern Ocean, and in the areas of deep water formation such as the North Atlantic, where man-made CO2 invasion is the greatest.
Sound can travel farther at a depth of about 1,000 m (the depth of the so-called deep sound channel) than at the surface. Most anthropogenic and natural sounds are generated at the surface, but they can leak into the deep sound channel, bend there, and travel thousands of kilometres in the ocean. “With time, as anthropogenic CO2 penetrates into the deep ocean, the changes in sound absorption will also propagate well below the deep sound channel axis,” says Ilyina. “Sound absorption will continue to decrease even after reductions in CO2 emissions because ocean pH will continue to decrease.”
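The deep sound channel exists because sound speed has a minimum at depth, and sound rays refract back toward that minimum rather than escaping to the surface or seafloor. A common classroom illustration is the Munk (1974) canonical sound-speed profile; the constants below are Munk's textbook values (axis near 1,300 m, though the axis depth varies regionally), not data from this study:

```python
import math

def munk_sound_speed(z: float) -> float:
    """Canonical Munk sound-speed profile (m/s) at depth z in metres.
    Channel axis at 1,300 m, scale depth 1,300 m, perturbation 0.00737."""
    zeta = 2.0 * (z - 1300.0) / 1300.0
    return 1500.0 * (1.0 + 0.00737 * (zeta + math.exp(-zeta) - 1.0))

# Sound speed is slowest at the channel axis, so sound that leaks
# downward from the surface bends back toward the axis and stays trapped:
for depth in (0, 500, 1300, 3000, 5000):
    print(f"{depth:>5} m: {munk_sound_speed(depth):7.1f} m/s")
```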
Naval, commercial, and scientific applications make extensive use of low-frequency sound because of its long-range propagation. Marine mammals also rely on low-frequency sound to find food and mates. As a result, ocean acidification may affect not only organisms at the bottom of the food chain, by reducing calcification in plankton and corals, but also higher trophic level species such as marine mammals, by lowering sound absorption in the ocean.
“We don’t fully understand what the impacts of these changes in ocean acoustics will be,” says Ilyina. “Because of decreasing sound absorption, underwater sound could travel farther, and this could lead to growing noise levels in the oceans. Increasing transparency of the oceans to low-frequency sounds could also enable marine mammals to communicate over longer distances.” The scientists say that further research is needed to address these questions.
Controversy is growing in Norway over a sunken wartime German U-boat whose toxic contents threaten a key fishing region.
Raising the wreck of U-864, which contains 65 tons of mercury, is now estimated to cost between 1.2 and 2.2 billion Norwegian kroner (C$215–396 million), twice as much as earlier estimates.
The shipwreck was located in March 2003 by the Royal Norwegian Navy, 2.2 miles west of the island of Fedje in the North Sea at a depth of 150 metres. The wreck, which is close to an area fished by EU vessels, has long been considered an environmental hazard by local fishermen and environmental groups.
U-864’s mission was to transport military equipment destined for the Japanese military industry, including approximately 67 tons of metallic mercury stored in her keel in 1,857 32 kg steel flasks. That the mercury was contained in steel canisters was confirmed when one of them was located and brought to the surface during surveys of the wreck in 2005.
Experts had long disagreed on whether the wreck should be raised or whether it would be better to build a sarcophagus to isolate the mercury from the marine environment, thereby eliminating the pollution hazard. A three-year study by the Norwegian Coastal Administration recommended entombing the wreck in a 12-metre-thick layer of sand, with a reinforcing layer of gravel or concrete to prevent erosion. This was proposed as a permanent solution, and the proposal notes that similar techniques have been used successfully to contain mercury-contaminated sites.
But just under a year ago the Oslo Government decided that the wreck should be raised, and a Dutch salvage company won a bid to raise the wreck at a cost of NOK 800 million (C$144 million). Helga Pedersen, the Norwegian fisheries minister at the time, said she was aware of the concern by the local fishing community about contamination.
The U-864 was launched in 1943 and sunk by the Royal Navy submarine HMS Venturer in what is thought to be the only instance of one submarine sinking another while both were submerged. All 73 men on board U-864 perished.
The U.S. Navy has discovered that the training it gives its submarine crews sometimes does not keep up with the complexity of new equipment. A case in point is the collision earlier this year between the submarine USS Hartford and the USS New Orleans, an American amphibious ship.
The sub was at periscope depth, and the men in the control room had been tracking the amphibious ship for nearly an hour. But automatic identification signals being received from another ship (moving in the same direction as the New Orleans, and apparently confused with it) led the crew to discount the sonar data indicating an imminent collision.
The navy investigation of the incident blamed specific crew members for allowing the collision to happen, but also noted that a lot of sensors were involved, and that navy procedures did not clearly address what to do when conflicting data is being received. Nuclear subs rarely spend this much time near the surface, and now have many more sensors to detect what is above and around the boat. Even the periscope is a much more complex instrument, containing radar and image manipulation devices along with the traditional visual optics. The conclusion was that, without some new types of training, it's too easy to become confused by the flood of data. This was one of the causes of the Hartford accident.
In the accident itself, the 24,000-ton USS New Orleans collided with the submerged USS Hartford (a 7,000-ton Los Angeles-class boat) in the narrow Strait of Hormuz at 1 a.m. local time. Fifteen sailors aboard the sub were injured, while a fuel tank on the USS New Orleans was torn open and 25,000 gallons of fuel oil spilled into the water. The USS Hartford rolled 85 degrees right after the collision, and substantial damage was done to the sail, including a leak.
The captain and the chief of the boat (the senior NCO) were dismissed shortly after the March 20 collision. The USS Hartford went to a Persian Gulf shipyard for emergency repairs, including a metal brace for the sail, which was twisted so that it leaned to the right. Temporary decking, railings and antennas were added to the topside of the sub to make the surface transit home easier.
Initially, the accident was blamed on sloppy leadership by the captain and the senior chief petty officer. The subsequent investigation found that lax discipline was tolerated throughout the ship, which led to sloppiness. In particular, the crew did not take all the precautions mandated for passing through a narrow waterway like the Strait of Hormuz. The investigation found many specific errors by the crew that contributed to the collision. These included supervisors not staying with the sonar operator, who, it turned out, was chatting with someone else when the collision (which the sonar would have provided warning of) occurred. The navigator was doing something else while listening to his iPod, and the officer in charge did not check the surface with the periscope, as he was supposed to do. The list went on, ultimately amounting to 30 errors in procedure.
Accidents like these are part of a larger problem in the navy: finding and retaining sailors capable of running a nuclear submarine. Sub commanders are under a lot of pressure to keep their sailors from leaving the navy, but the long periods submarine sailors spend away from their families create pressure to get out and take a civilian job close to home. The USS Hartford had been at sea for five months when it had the accident.
Submarine sailors are very capable, highly trained people, and getting a better paying civilian job is not a problem for them. So sub captains try to keep their crews happy, which often leads to lax discipline, and that in turn leads to collisions like these. Many sub captains see this as a calculated risk, knowing that in wartime their highly skilled crews would snap together and do the job. But a sub commander's first priority, at least in peacetime, is the safety of his boat. In wartime, the mission comes first.
There’s precedent for this. During the early days of World War II, the U.S. Navy had to replace most of its sub captains. These men had risen to their positions in the peacetime navy by doing things by the book and always adhering to procedure. Moreover, peacetime sub operations did not include the kind of unexpected, highly stressful situations typical of wartime. In combat, much more flexible commanders were needed, and these were the ones who came in and won the American submarine war in the Pacific. The navy has found that the flood of new technology is creating unexpected situations that crews have to be warned about, and trained to handle.
For the first time, Pentagon planners in 2010 will include climate change among the security threats identified in the Quadrennial Defense Review, the Congress-mandated report that updates Pentagon priorities every four years.
The reference to climate change follows the establishment in October of a new Center for the Study of Climate Change at the Central Intelligence Agency.
But the new attention to climate concerns among U.S. security officials does not mean the Pentagon and the CIA have taken sides in the debate over the validity of data on global warming. As with nuclear terrorism, deadly pandemics or biological warfare, it only means they want to be prepared.
“I always look at the worst case,” says one senior intelligence official who follows climate issues. “Whether it’s global warming or the chance of Country A invading Country B, I just assume the most likely outcome is the worst one.”
Military officials, accustomed to drawing up detailed plans for a wide variety of contingencies, have a similar view.
“The American people expect the military to plan for the worst,” says retired Vice Adm. Lee Gunn, a 35-year Navy veteran now serving as president of the American Security Project. “It’s that sort of mindset, I think, that has convinced, in my view, the vast majority of military leaders that climate change is a real threat and that the military plays an important role in confronting it.”
Among the scenarios that concern security planners is the melting of the massive Himalayan ice mass. In theory, the rivers fed by the Himalayan glaciers would flood at first, then dry up once the glaciers retreat. That would endanger tens of millions of people in lowland Bangladesh.
Retired Air Marshal A.K. Singh, a former commander in India’s air force, foresees mass migrations across national borders, with militaries soon becoming involved.
“It will initially be people fighting for food and shelter,” Singh says. “When the migration starts, every state would want to stop the migrations from happening. Eventually, it would have to become a military conflict. Which other means do you have to resolve your border issues?”
The drafters of the Quadrennial Defense Review were instructed by Congress to accept the assessments of the Intergovernmental Panel on Climate Change (IPCC), the international body established by the United Nations and the World Meteorological Organization to gather and report world climate data.
Neither the Pentagon nor U.S. intelligence agencies make an independent effort to assess the planet’s climate, and U.S. security officials have generally tried to distance themselves from any debate over the validity of the IPCC data. Instead, they focus on the security repercussions.
“The [IPCC] projections lead us to believe that severe weather events will increase in intensity in the future, perhaps in frequency as well,” says Amanda Dory, the deputy assistant secretary of defense overseeing the review process. “This is a mission area where the Department [of Defense] already responds on a regular basis in support of civil authorities, whether for floods, wildfires [or] hurricanes. We believe there’s a possibility those types of requests will increase in the future.”
Climate change could also have implications for ship and aircraft designers.
“When you talk about building ships that are going to last from 30 to 50 years or programming for aircraft that are not going to be put in the air for 20 years, you have to be thinking about the kinds of changed conditions into which you’re going to throw them in the future,” Gunn says.
Still, there is only so much military planners can do to prepare for the consequences of climate change. The 2010 Quadrennial Defense Review, due to be delivered in February, is required to identify what global warming may mean for the Defense Department’s “roles, missions and installations.”
But Dory of the Pentagon says there won’t be much change in that area.
“We don’t anticipate that there are new mission areas as a result of climate change,” Dory says. “Similarly, there may be changes in technical specifications for platforms, but not the need for new types of platforms that we don’t already possess.” (In Pentagon jargon, “platforms” are the things on which weapons are carried, like ships or aircraft.)
In the short term, climate change may be a more important subject for intelligence officials than for military planners.
Analysts at the National Intelligence Council are trying to develop a set of early warning signs that could suggest where the next famine might arise or which countries are in most danger of being destabilized as a result of dramatic climate changes. Intelligence officials put those countries on a “stability watch list.”
But how far to go with such climate and security projections is a matter of dispute.
“We suck at predicting wars, and we’re not very good at predicting peace,” says James Carafano, a retired Army officer and former West Point instructor who now directs foreign policy and national security studies at the Heritage Foundation. “These are huge, giant, complex systems, and people who take a linear approach to these things and say, ‘Oh, well, if this happens, then we’ll have to worry about that’ — that’s not how reality works out.”
Perhaps not, but it’s the job of national security officials at least to imagine future climate and security scenarios, whether they can do something about them or not.
Work has finally begun on the U.S. Coast Guard’s latest Deepwater addition: the fast response cutter.
Bollinger Shipyards of Lockport, La., began construction in late November on Sentinel, the first in a class of the fast response cutters. The Coast Guard awarded a contract option for about $141 million to Bollinger Shipyards on Dec. 15 to begin production on three additional fast response cutters. The second cutter will be called Guardian, but the third and fourth hulls have not been named, said Laura Williams, a spokeswoman for the acquisitions directorate at Coast Guard headquarters in Washington.
The design for the 154-foot patrol boats successfully cleared a critical design review in mid-November and the Homeland Security Department’s Acquisition Review Board earlier this month. In September 2008, the Coast Guard awarded Bollinger an $88 million contract for the lead Sentinel. The initial patrol boat, which will be home ported in Miami, is expected to be delivered to the Coast Guard in the third quarter of fiscal 2011.
The Sentinel-class contract is worth up to $1.5 billion if all options for 34 cutters are exercised. The 154-foot patrol boats will replace the aging 110-foot Island-class patrol boats. The longer boats allow for larger crews – 23 people versus 16 – which the Coast Guard felt were needed. The larger cutters also handle better in 8-foot seas and have centralized berthing, which reduces crew fatigue in stormy weather. The cutter will be outfitted with communications and computer equipment that will allow the crew to communicate with the cutter’s rigid-hull inflatable boat team beyond the horizon – another advantage over the Island class.
The other capabilities will remain the same – the fast response cutters will have a flank speed of 28 knots and be able to operate independently for a minimum of five days at sea. The cutters will be used in drug and migrant interdiction, fisheries enforcement, search and rescue, and port security.
The U.S. House and Senate appropriators agreed to a $636.3 billion defense budget for 2010 on Dec. 15, defying the Obama administration by buying more C-17s and an alternate engine for the Joint Strike Fighter, and by refusing to kill the presidential helicopter.
The sharpest jab at Defense Secretary Robert Gates and President Barack Obama came in the form of a $2.5 billion add-on for buying 10 more C-17 cargo planes.
Gates wanted to end the program, contending that the Air Force has plenty of C-17s and other airlift planes.
But the C-17 is a popular jobs program in at least a dozen states, so the House voted to spend $1.2 billion to buy three more planes and the Senate voted to spend $2.5 billion for 10 more. The lawmakers decided to compromise by accepting the Senate’s plan.
Agreement on the alternate engine went more the House’s way. Again, Gates argued against spending any money on it, and the Senate sided with him. But the House voted to spend $560 million to keep developing the engine. The House gave in a little, the Senate gave in a lot and the compromise version of the spending bill now includes $465 million for the engine.
The alternate engine program is intended to develop an alternative to the Pratt & Whitney F135 engine. Lawmakers argue that if problems develop with the F135, they could ground much or all of the Joint Strike Fighter fleet, which will gradually comprise the bulk of the Air Force, Navy and Marine Corps fighter fleets. The alternate engine would provide another option. It is being developed by General Electric and Rolls Royce, and development alone is expected to cost about $5 billion. Gates argued that the alternate engine was a waste of money.
The decision to keep funding the presidential helicopter may be seen as a partial victory for Gates. The beleaguered helicopter program receives $130 million in the new appropriations bill. Of that, $100 million is for “technology capture” so that the $3.3 billion already spent on the VH-71 won’t be wasted. The House wanted to spend $485 million to “operationalize” five helicopters that are already mostly built. Gates pulled the plug on the program last spring after the cost for 23 helicopters increased from $6.5 billion to $13 billion.
Appropriations conferees also agreed to spend $15 billion on new ships – $120 million more than Gates requested. That would pay for seven ships: one DDG-51 destroyer, one attack submarine, two Littoral Combat Ships, one joint high-speed vessel and two T-AKE cargo ships.
In a report on their compromise bill, lawmakers complained that the shipbuilding plan for 2010 “once again falls short” of the 10 ships needed annually to increase the fleet to 313 ships.
Congress has until Dec. 18 before the Defense Department runs out of money, but that doesn’t mean the Defense Appropriations bill must pass by that date.
The military – and most of the rest of the U.S. government – has been operating under a “continuing resolution” since Oct. 1 because most appropriations bills did not pass in time for the start of the new budget year. Lawmakers may simply pass another continuing resolution and postpone a final vote on the Defense Appropriations bill.
After four years out of operational service for a mid-life refueling and complex overhaul (RCOH) and post-shakedown availability/supplemental restricted availability (PSA/SRA) work, the Nimitz-class aircraft carrier USS Carl Vinson (CVN 70) sailed out of Northrop Grumman’s Newport News shipyard on 3 December to rejoin the US Navy’s active fleet.
By 13 December, F/A-18C Hornets from Strike Fighter Squadron 34 and SH-60 Seahawk helicopters, among other aircraft, were participating in two days of carrier flight deck certifications for air department personnel.
Begun in November 2005, Carl Vinson’s USD3.12 billion RCOH has seen the 91,500-ton vessel stripped out and refurbished from the keel up. The refit involved the modernization of some 2,300 compartments.
As part of the vessel’s CAPSTONE combat system upgrade, Raytheon’s Rolling Airframe Missile (RAM) system replaced one of the ship’s two Mk 29 launchers and both Phalanx Vulcan 20 mm close-in-weapon system mounts.
The island superstructure was reconfigured and now bears a 70-ton main mast and updated sensors, similar to those equipping the final Nimitz-class carrier, USS George H W Bush (CVN 77).
Marport C-Tech has successfully recertified its Quality Management System to ISO 9001:2008, the global benchmark for standards of excellence. The company was last certified to ISO 9001:2000 in 2006. After completion of the three-year validity period, Marport C-Tech recently underwent a stringent recertification audit.
ISO 9001:2008 is a comprehensive management system standard maintained by ISO, the International Organization for Standardization.
It is administered by accreditation and certification bodies. Although the standard originated in the manufacturing sector, it’s now in use across many industries.
Some of the requirements in ISO 9001:2008 include:
- A set of procedures that cover all key processes in the business
- Monitoring processes to ensure they are effective
- Keeping adequate records
- Checking output for defects, with appropriate corrective action where necessary
- Regularly reviewing individual processes and the quality system itself for effectiveness
- Facilitating continual improvement
A company or organization that has been independently audited and certified to be in conformance with ISO 9001:2008 may publicly state that it is “ISO 9001 certified” or “ISO 9001 registered”.
The recertification to ISO 9001:2008 quality standards is a reiteration of the Marport C-Tech commitment to continual improvement, a vital link in our journey towards excellence. We will continue to review our procedures to ensure excellence across all aspects of operations.
A new analysis of the geological record of the Earth’s sea level, carried out by scientists at Princeton and Harvard universities and published in the Dec. 16 issue of Nature, employs a novel statistical approach that reveals the planet’s polar ice sheets are vulnerable to large-scale melting even under moderate global warming scenarios. Such melting would lead to a large and relatively rapid rise in global sea level.
According to the analysis, an additional 2 degrees Celsius of global warming could commit the planet to 6 to 9 metres of long-term sea level rise. This rise would inundate low-lying coastal areas where hundreds of millions of people now reside. It would permanently submerge New Orleans and other parts of southern Louisiana, much of southern Florida and other parts of the U.S. East Coast, much of Bangladesh, and most of the Netherlands, unless unprecedented and expensive coastal protection were undertaken. And while the researchers’ findings indicate that such a rise would likely take centuries to complete, if emissions of greenhouse gases are not abated, the planet could be committed during this century to a level of warming sufficient to trigger this outcome.
As part of the study, the researchers compiled an extensive database of geological sea level indicators for a period known as the last interglacial stage (about 125,000 years ago). Polar temperatures during this stage were likely 3 to 5 degrees Celsius warmer than today, as is expected to occur in the future if temperatures reach about 2 to 3 degrees Celsius above pre-industrial levels.
Sea levels during the last interglacial stage are of interest to scientists and important to policymakers for several reasons. Most notably, the last interglacial stage is relatively recent by geological standards, making it feasible for climate scientists to develop a credible sea level record for the period, and it is the most recent time period when average global temperatures and polar temperatures were somewhat higher than today. Because it was slightly warmer, the period can help scientists understand the stability of polar ice sheets and the future rate of sea level rise under low to moderate global warming scenarios. The findings indicate that sea level during the last interglacial stage rose for centuries at least two to three times faster than the recent rate, and that both the Greenland and West Antarctic ice sheets likely shrank significantly and made important contributions to sea level rise.