Fake job postings proliferate in layoff-hit tech industry

If you didn’t hear back about that great-looking tech position you applied for, it might not be because there were too many applicants scrambling to find a job amid rolling layoffs. There’s a distinct possibility the posting was fake to begin with.

We’re talking here about “ghost jobs”: the practice of posting openings for positions that are fake, already filled, or intended for internal applicants and opened to the public only for legal purposes.

[…]

According to research published in August by MyPerfectResume, 81 percent of recruiters admitted to posting ghost jobs, and 41 percent said half or more of the jobs they post are straight-up fake. Resume Builder, which surveyed more than a thousand hiring managers, similarly found that 40 percent of companies had posted fake jobs in the past year, and that three in ten had active fake openings as of June, when it published its report.

Resumes, ghost openings are the lash

According to those two reports, the reasons companies post ghost jobs are, frankly, insidious. While some ghost posts are about collating a list of outside talent for future roles or making it appear like a company is growing when it isn’t, the other justifications basically boil down to torturing employees into working harder.

Resume Builder found that 63 percent of hiring managers posted ghost jobs to signal to overworked employees that relief was on the way, while 62 percent said they did it to make employees feel replaceable. MyPerfectResume found similar justifications, in addition to maintaining a presence on job boards while not hiring, assessing “how difficult it would be to replace certain employees,” and making “the company look viable during a hiring freeze.”

[…]

Legal news site Above The Law noted earlier this year that there aren’t any laws against posting ghost jobs, meaning the practice is likely to continue as more tech workers find themselves adrift from a job and frantically looking for a safe harbor.

“It’s a concerning scenario, particularly when these misleading postings originate from HR departments — the very entities entrusted with shaping accurate perceptions of their organizations,” said Stacie Haller, chief career advisor at Resume Builder. “Whether it’s to create an illusion of company expansion or to foster a sense of replaceability among employees, such practices are not acceptable.”

Creating laws to eliminate ghost job postings may be easier said than done, Above The Law noted. It’s likely a state matter, tax attorney Steven Chung wrote, but any legislation would need to walk a fine line between enforcing honest postings and forcing companies to hire candidates who might not be the right fit.

“Businesses have incentives to post job openings they do not intend to fill,” Chung said. “Governments should investigate this to make sure that job seekers are not led on wild goose chases while giving enough flexibility to allow businesses to hire anyone based on good business judgment.”

[…]

Apologies to the laid-off masses in the tech industry and beyond, but it looks like you’ll end up wasting time in your job hunt applying for fake positions, and it’s all perfectly legal. ®

Source: Fake job postings proliferate in layoff-hit tech industry • The Register

How The Army Will Use Its Super Integrated Air Defense System

Developed in partnership with Northrop Grumman, the Integrated Battle Command System, or IBCS, is the beating heart of the U.S. Army’s future air and missile defense architecture.

[…]

This system networks with current and future sensors and weapons platforms – regardless of source, service, or domain – to create an integrated fire control network that identifies and engages air and missile threats. Its modular, open, and scalable architecture gives users a sensor-fused, highly accurate, rapidly actionable ‘picture’ of the full battlespace.

IBCS tackles evolving air and missile threats, from incoming drone swarms to hypersonic weapons, while enabling an ‘any sensor, best shooter’ strategy. This allows operators to select the optimal effector for the situation.

[…]

The challenge lies in detecting and optimally engaging these diverse threats with all available defense systems.

Over the years, the U.S. Army has made significant investments in systems like Patriot, which is a medium-range air defense system, and THAAD, which is a system for intercepting short, medium, and intermediate-range ballistic missiles. These systems were traditionally designed to be tightly coupled between the command-and-control [C2], the sensors, and the effectors, making interoperability with other systems very difficult.

IBCS’s big idea is a network-enabled command-and-control architecture designed under a Modular Open System Approach [MOSA], which essentially componentizes systems like Patriot: you remove the command and control, then adapt the sensor [the Patriot radar] and the launcher effector onto an integrated fire-control network.

The IBCS architecture integrates various sensors and effectors into a unified network. It is capable of collecting data from across the domains of ground, air, maritime, and space, to create a single integrated air picture that identifies all inbound threats.

An IBCS Engagement Operations Center is unloaded from a C-5 Galaxy transport aircraft. U.S. Army

[…]

There are essentially three major equipment items in IBCS. There’s the Engagement Operations Center [EOC], which you can think of as a shelter that mounts on the back of a five-ton truck; it has an antenna mast and carries the communications onboard. The EOC remotes into something we call the Integrated Collaborative Environment [ICE], which is essentially a standard Army AirBeam tent. The ICE is where soldiers plan and fight the air battle, and IBCS provides for remoting up to 10 operator workstations into it. The EOC also has two operator workstations of its own, so operators can employ and fight the system while the ICE is being established.
[…]

Today, with a standard U.S. Army Patriot, if you lose the Engagement Control Station at the battery level, that battery is out of action. So culturally, this is a big change. Patriot was designed in the 1970s and fielded in the 1980s, so employment thinking is still dominated by experience with Patriot. IBCS genuinely changes the deployment paradigm: its network architecture allows air and missile defense task forces to be tailored, so rather than deploying a complete Patriot battalion, a commander could deploy a task force encompassing multiple types of sensors and effectors suited to a specific mission. This gives commanders a high degree of operational flexibility.

It’s also worth underlining the power of this open systems architecture. Traditionally, to perform a PAC-3 missile engagement, the uplink to the missile has to be performed through the radar. That’s very limiting because it means you have to deploy launchers in proximity to the radar to effect that uplink, which effectively constrains the range of the missile and the battlespace for performing engagements.

The Army has developed a capability called Remote Interceptor Guidance 360, or RIG-360, which is essentially an antenna uplink device that can be positioned at various locations on the battlefield. It removes the need to physically tether launchers and effectors to the location of the radar, further decoupling them from the sensor.

[…]

IBCS is designed to communicate with other platforms and command-and-control systems across a number of data links, including Link 16 and MADL [Multifunction Advanced Data Link]. In flight testing, IBCS has demonstrated the capability to integrate with the F-35. In addition, one of the engineering initiatives the Army has pursued with the Missile Defense Agency, and which we have supported, is a bridging technology known as the Joint Track Management Capability, or JTMC bridge.

The U.S. Navy has a system very similar to IBCS called Cooperative Engagement Capability [CEC]. CEC takes data from multiple platforms – such as the SPY-6 radar on Aegis-equipped ships, the E-2D Hawkeye, and the U.S. Marine Corps’ G/ATOR radar [AN/TPS-80 Ground/Air Task-Oriented Radar] – and integrates it to create a high-fidelity track that is distributed across the network. The bridge enables the passing of data back and forth between the two networks to create a single integrated air picture.

TWZ: How does IBCS physically connect to the distributed systems at long ranges, and how might it plug into JADC2 in the future?

Lamb: IBCS is capable of being connected over long distances via fiber optics and satellite communications. We’ve demonstrated its ability to link with airborne platforms and sensors across various domains, with data displayed in command centers thousands of miles away.

[…]

Over the last year or so we’ve been integrating the Army’s Lower Tier Air and Missile Defense Sensor, known as LTAMDS. The Army also has plans to integrate the latest Sentinel A4 radar, and it has announced plans to integrate THAAD [Terminal High Altitude Area Defense]. There’s also a budget for deeper integration of the F-35 fighter as well as passive sensors.

[…]

Source: How The Army Will Use Its Super Integrated Air Defense System

TL;DR – this system feeds everything the sensors detect into a central network, where any weapons system can develop a firing solution and engage. This means that if a hugely expensive Patriot radar detects a tiny drone, you don’t need to engage the drone with a Patriot missile; you can easily hand the target off to a cheaper weapons system and engage with that instead.
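As a toy illustration of that hand-off logic (this is not Northrop Grumman's actual fire-control software; every effector name, cost figure, and capability set below is an invented example), "any sensor, best shooter" can be sketched as picking the cheapest networked effector capable of defeating a given track:

```python
# Illustrative sketch only: choose the cheapest effector on the network
# that can defeat a detected threat type. All names, costs, and
# capability sets are invented for the example.

EFFECTORS = [
    # (name, notional cost per shot in USD, threat types it can defeat)
    ("Counter-drone interceptor", 100_000, {"drone"}),
    ("PAC-3 MSE", 4_000_000, {"drone", "cruise_missile", "ballistic_missile"}),
    ("THAAD interceptor", 12_000_000, {"ballistic_missile"}),
]

def best_shooter(threat_type):
    """Return the cheapest (name, cost, capabilities) able to engage, else None."""
    capable = [e for e in EFFECTORS if threat_type in e[2]]
    if not capable:
        return None  # nothing on the network can take this shot
    return min(capable, key=lambda e: e[1])

# A Patriot radar detects a small drone; the network hands the track
# to the cheapest capable effector rather than expending a PAC-3.
name, cost, _ = best_shooter("drone")
print(name, cost)  # Counter-drone interceptor 100000
```

The decoupling described above is what makes this selection possible: because sensors and effectors publish into one fire-control network, the choice of shooter is no longer tied to whichever radar made the detection.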

Using mathematics to better understand cause and effect

Consider an example from climate science. Experts studying large atmospheric circulation patterns and their impacts on global weather would like to know how these systems might change with warming climates. Here, many variables come into play: ocean and air temperatures and pressures, ocean currents and depths, and even details of the earth’s rotation over time. But which variables cause which measured effects?

That is where information theory comes in as the framework to formulate causality. Adrián Lozano-Durán, an associate professor of aerospace at Caltech, and members of his group both at Caltech and MIT have developed a method that can be used to determine causality even in such complex systems.

The new mathematical tool can tease out the contributions that each variable in a system makes to a measured effect — both separately and, more importantly, in combination. The team describes its new method, called synergistic-unique-redundant decomposition of causality (SURD), in a paper published today, November 1, in the journal Nature Communications.

The new model can be used in any situation in which scientists are trying to determine the true cause or causes of a measured effect. That could be anything from what triggered the downturn of the stock market in 2008, to the contribution of various risk factors in heart failure, to which oceanic variables affect the population of certain fish species, to what mechanical properties are responsible for the failure of a material.

“Causal inference is very multidisciplinary and has the potential to drive progress across many fields,” says Álvaro Martínez-Sánchez, a graduate student at MIT in Lozano-Durán’s group, who is lead author of the new paper.

For Lozano-Durán’s group, SURD will be most useful in designing aerospace systems. For instance, by identifying which variable is increasing an aircraft’s drag, the method could help engineers optimize the vehicle’s design.

“Previous methods will only tell you how much causality comes from one variable or another,” explains Lozano-Durán. “What is unique about our method is its ability to capture the full picture of everything that is causing an effect.”

The new method also avoids the incorrect identification of causalities. This is largely because it goes beyond merely quantifying the effect produced by each variable independently. In addition to what the authors refer to as “unique causality,” the method incorporates two new categories of causality, namely redundant and synergistic causality.

Redundant causality occurs when more than one variable produces a measured effect, but not all the variables are needed to arrive at the same outcome. For example, a student can get a good grade in class because she is very smart or because she is a hard worker. Both could result in the good grade, but only one is necessary. The two variables are redundant.

Synergistic causality, on the other hand, involves multiple variables that must work together to produce an effect. Each variable on its own will not yield the same outcome. For instance, a patient takes medication A, but he does not recuperate from his illness. Similarly, when he takes medication B, he sees no improvement. But when he takes both medications, he fully recovers. Medications A and B are synergistic.

SURD mathematically breaks down the contributions of each variable in a system to its unique, redundant, and synergistic components of causality. The sum of all these contributions must satisfy a conservation-of-information equation that can then be used to figure out the existence of hidden causality, i.e., variables that could not be measured or that were thought not to be important. (If the hidden causality turns out to be too large, the researchers know they need to reconsider the variables they included in their analysis.)
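A small numerical sketch (this is not the authors' SURD implementation, just standard mutual-information estimates on synthetic binary data) shows why these categories matter. With an XOR target, neither input alone is informative but the pair together is (synergy); with a duplicated input, either copy alone already carries the full information (redundancy):

```python
import math
import random

def mutual_info(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy, px, py = {}, {}, {}
    for x, y in pairs:
        pxy[(x, y)] = pxy.get((x, y), 0) + 1 / n
        px[x] = px.get(x, 0) + 1 / n
        py[y] = py.get(y, 0) + 1 / n
    return sum(p * math.log2(p / (px[x] * py[y])) for (x, y), p in pxy.items())

random.seed(0)
bits = [(random.randint(0, 1), random.randint(0, 1)) for _ in range(10_000)]

# Synergy: the target is XOR of two independent inputs. Each input alone
# carries ~0 bits about the target; the pair carries ~1 bit.
xor = [(a, b, a ^ b) for a, b in bits]
print(mutual_info([(a, t) for a, b, t in xor]))       # ~0.0 bits
print(mutual_info([((a, b), t) for a, b, t in xor]))  # ~1.0 bit

# Redundancy: the target equals a variable that appears twice. Either
# copy alone already carries the full ~1 bit; the other adds nothing unique.
red = [(a, a, a) for a, _ in bits]
print(mutual_info([(a, t) for a, _, t in red]))       # ~1.0 bit
```

SURD goes well beyond this sketch by attributing those bits to unique, redundant, and synergistic components under its conservation-of-information constraint, but the toy example captures the behavior that purely pairwise methods miss.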

To test the new method, Lozano-Durán’s team used SURD to analyze 16 validation cases — scenarios with known solutions that would normally pose significant challenges for researchers trying to determine causality.

“Our method will consistently give you a meaningful answer across all these cases,” says Gonzalo Arranz, a postdoctoral researcher in the Graduate Aerospace Laboratories at Caltech, who is also an author of the paper. “Other methods mix causalities that should not be mixed, and sometimes they get confused. They get a false positive identifying a causality that doesn’t exist, for example.”

In the paper, the team used SURD to study the creation of turbulence as air flows around a wall. In this case, air flows more slowly at lower altitudes, close to the wall, and more quickly at higher altitudes. Previously, some theories of what is happening in this scenario have suggested that the higher-altitude flow influences what is happening close to the wall and not the other way around. Other theories have suggested just the opposite — that the air flow near the wall affects what is happening at higher altitudes.

“We analyzed the two signals with SURD to understand in which way the interactions were happening,” says Lozano-Durán. “As it turns out, causality comes from the velocity that is far away. In addition, there is some synergy where the signals interact to create another type of causality. This decomposition, or breaking into pieces of causality, is what is unique for our method.”


Story Source:

Materials provided by California Institute of Technology. Note: Content may be edited for style and length.


Journal Reference:

  1. Álvaro Martínez-Sánchez, Gonzalo Arranz, Adrián Lozano-Durán. Decomposing causality into its synergistic, unique, and redundant components. Nature Communications, 2024; 15 (1) DOI: 10.1038/s41467-024-53373-4

Source: Using mathematics to better understand cause and effect | ScienceDaily