Defense Systems Put Ethernet Cable and Connectors to the Test
Rugged cables and connectors deliver improved safety in demanding conditions.
By Dustin Guttadauro, Product Manager, L-com Global Connectivity
The Department of Defense’s commercial off-the-shelf (COTS) initiative has proven its ability to save money and shorten delivery times while still meeting defense system requirements. The challenge for designers is determining whether a specific commercial component is acceptable for defense service. The same question applies to Ethernet cables and connectors, as tests have shown that commercial-grade products can meet an untimely end in harsh conditions. Fortunately, manufacturers ruggedize Ethernet cables and connectors using a variety of techniques, and it’s important to understand them before making a selection.
Ethernet is increasing its presence in defense systems (Figure 1)
for the same reasons it is ubiquitous elsewhere: a 37-year record
of continuing advancement, inherent interoperability that fosters
modularity, the ability to eliminate a power cable through Power
over Ethernet (PoE), support for both copper and optical fiber,
and a huge base of suppliers, to name a few. These benefits and
others are making Ethernet the “go-to” data communications
solution for use in many types of defense platforms.
That said, ground- and sea-based systems can expose Ethernet cables and connectors to extreme cold, blistering heat, salt spray, and chemicals, as well as rough treatment such as being run over by the trucks used to haul equipment racks. Airborne platforms have their own unique demands. Other considerations include the possible need for low smoke zero halogen (LSZH)-rated cables. In addition, Ethernet’s continuing performance gains are being pushed hard by rapid advances in optical, RF, and microwave sensors that generate massive amounts of data and demand extremely high throughput.
Collectively, environmental and physical hazards, the proliferation of high-resolution sensors, data rates of at least 10 Gb/s, and the presence of extraneous low- and high-frequency signals make ensuring signal integrity a major design challenge from the component level through the system level.
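To put the throughput requirement in perspective, the short Python sketch below checks a required data rate against the commonly cited nominal ratings for twisted-pair cable categories. The figures are the standard category ratings rather than anything specific to a ruggedized product, and actual reach (notably 10 Gb/s over Cat 6, which is length-limited) depends on the installed channel, so treat this as a first-pass screen, not a qualification.

```python
# First-pass screen: does a cable category's nominal rating meet the required
# data rate? Ratings below are the commonly cited category bandwidths and
# nominal maximum data rates; real reach depends on the installed channel.

CATEGORY_RATINGS = {
    # category: (bandwidth_mhz, nominal_max_data_rate_gbps)
    "Cat 5e": (100, 1),
    "Cat 6": (250, 10),    # 10 Gb/s only over short, length-limited channels
    "Cat 6A": (500, 10),
    "Cat 8": (2000, 40),
}

def supports_rate(category: str, required_gbps: float) -> bool:
    """Return True if the category's nominal rating meets the required rate."""
    _, max_rate = CATEGORY_RATINGS[category]
    return max_rate >= required_gbps

if __name__ == "__main__":
    for cat in CATEGORY_RATINGS:
        print(f"{cat}: supports 10 Gb/s? {supports_rate(cat, 10)}")
```

In practice, copper links carrying 10 Gb/s over full-length runs generally call for Cat 6A or better, which is one reason the cable category belongs on the requirements list alongside the environmental factors discussed next.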
Maintaining Optimum Signal Integrity
The first step in specifying a ruggedized Ethernet cable is a thorough evaluation of the conditions to which it will be subjected. These include environmental and mechanical factors such as minimum and maximum operating temperatures, exposure to corrosive or otherwise hazardous chemicals, shock and vibration, frequent connection and disconnection, and any other stresses dictated by the application.
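As one way of keeping that evaluation systematic, the minimal Python sketch below captures the factors just listed as a simple requirements record and screens a candidate cable’s datasheet values against it. The field names and structure are hypothetical placeholders for illustration, not vendor specifications.

```python
# Minimal sketch of the evaluation step described above: record the
# application's environmental and mechanical requirements, then screen a
# candidate cable's datasheet values against them. All field names are
# hypothetical placeholders, not vendor specifications.
from dataclasses import dataclass

@dataclass
class CableRequirements:
    min_temp_c: float           # coldest expected operating temperature
    max_temp_c: float           # hottest expected operating temperature
    chemical_exposure: bool     # corrosive fluids, fuels, or solvents present?
    high_shock_vibration: bool  # e.g., tracked vehicles, shipboard, airborne
    mating_cycles: int          # expected connect/disconnect cycles

@dataclass
class CableCandidate:
    rated_min_temp_c: float
    rated_max_temp_c: float
    chemical_resistant_jacket: bool
    rated_for_shock_vibration: bool
    rated_mating_cycles: int

def meets_requirements(req: CableRequirements, cable: CableCandidate) -> bool:
    """Simple pass/fail screen against the datasheet values."""
    return (
        cable.rated_min_temp_c <= req.min_temp_c
        and cable.rated_max_temp_c >= req.max_temp_c
        and (cable.chemical_resistant_jacket or not req.chemical_exposure)
        and (cable.rated_for_shock_vibration or not req.high_shock_vibration)
        and cable.rated_mating_cycles >= req.mating_cycles
    )
```

A real qualification flow would extend such a screen with EMI, LSZH, and connector-interface criteria, but even a simple checklist helps keep the evaluation consistent across candidate parts.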
Another issue of growing severity is the presence of unwanted signals and extraneous electrical noise, a concern in any environment but worse in aircraft, where large amounts of electronic equipment are packed closely together. Ensuring that the electrical performance of Ethernet cables and connectors is not degraded by low- and high-frequency electromagnetic energy and crosstalk is therefore essential.
The degree to which this can be achieved depends on the performance of the shielding within the cable, between the cable and connector, and at the connector interface. The requirements of many applications can be satisfied with cables using foil shielding, but when cables are used in the presence of high-power RF and microwave signals, the highest shielding effectiveness is needed, achieved through a combination of foil and braid. Manufacturers have developed proprietary techniques for achieving high shielding effectiveness, so it’s important to review their offerings.
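For reference, shielding effectiveness (SE) is commonly expressed in decibels as 20·log10 of the ratio of the incident field to the field that penetrates the shield. The short Python sketch below converts between the two forms; the example SE values are illustrative assumptions spanning the rough range from foil-only to foil-plus-braid constructions, not measurements of any specific product.

```python
# Shielding effectiveness (SE) in dB is 20*log10 of the ratio of the field
# strength without the shield to the field strength that penetrates it. The
# example SE values below are illustrative placeholders, not measured figures
# for any particular cable construction.
import math

def shielding_effectiveness_db(incident_field: float, transmitted_field: float) -> float:
    """SE (dB) from incident and transmitted field strengths (same units)."""
    return 20 * math.log10(incident_field / transmitted_field)

def residual_fraction(se_db: float) -> float:
    """Fraction of the incident field that still reaches the conductors."""
    return 10 ** (-se_db / 20)

if __name__ == "__main__":
    for se in (40, 60, 90):  # illustrative range, foil-only up to foil-plus-braid
        print(f"SE = {se} dB -> {residual_fraction(se):.5%} of the field penetrates")
```

The logarithmic scale is worth keeping in mind when comparing datasheets: each additional 20 dB of SE cuts the penetrating field by another factor of ten.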
Figure 1: The Army’s Warfighter Information Network-Tactical (WIN-T) relies on Ethernet performance in its role as a comprehensive platform for distributing intelligence, surveillance, and reconnaissance information, including voice, data, and real-time video. (Image Source: U.S. Army)