The intersection of the hybrid/remote working trend and paranoia about IP theft in a cutthroat global marketplace has driven employers to ratchet up surveillance of their employees in recent years.
But this intersection is likely to see many collisions involving legal and ethical considerations, fueled in part by sales of virtually voyeuristic surveillance software projected to top $12 billion by 2033.
“We really need to tread carefully,” Nikki Rivers, a corporate labor associate at Mintz, Levin, Cohn, Ferris, Glovsky and Popeo warned during a panel discussion this week at the Association of Corporate Counsel annual meeting in Nashville, Tennessee.
Keeping an eye on workers is as old as time. Slackers have always been with us. But in an information-based economy employers fret not only about time theft but that employees will stuff their jackets with the company’s crown jewels on the way out the door.
With discreet video surveillance today, “you can (literally) see if they’re trying to hide something under their jacket,” said Phuong Nguyen, senior manager of cybersecurity, data protection and insider risk management at Ernst & Young.
Of course, video is nothing compared to the sophisticated tools available today.
Keystroke logging tracks activity on employees’ devices. Email is a technical cinch to monitor. GPS allows employers to pinpoint an employee’s location.
And that’s to say nothing of emerging AI tech that could crunch various inputs to gauge employee emotions and perhaps even their intentions.
Employers defend surveillance practices as not necessarily “spying”—not collecting information for collecting’s sake—but as a way to protect the enterprise while attempting to balance that with employees’ privacy rights, said Lewis Dolezal, senior counsel at construction products-maker CPG.
Sometimes, for example, the company has eyes on an employee who is on a performance improvement plan or on a termination list—potential employee stressors that could lead to nefarious behavior, noted Nguyen, a former U.S. Air Force signals intelligence officer.
“At their core, these technologies have a good purpose,” Rivers said. On the other hand, she worries companies might go too far. She questioned why an employer would go as far as demanding employees be subject to retinal scans to clock in for work, for example.
“Why do we need it?” Rivers said of that application. “That is far and above where we need to go.”
Such applications could run afoul of laws governing biometric data use and storage, especially when swiping an access card to log in would be perfectly adequate, she added.
Panelists also urged care in the use of technology as a predictive tool, open to misinterpretation.
Take, for example, the otherwise exemplary employee who is flagged for using company equipment to access a job-searching site.
Some companies would conclude the employee is looking to leave and might then move to terminate the worker, if only out of concern that they could use their remaining time to gather sensitive company information for use at a new job with a competitor.
But it could be that employee was simply glancing at a job listing for a family member, Nguyen said, such as for a teen moping about at home.
Now, that employee who had no intention of leaving is going to have to, Rivers lamented.
Use of surveillance technology can also implicate numerous federal and state laws, such as Illinois’ biometric privacy law, the California Privacy Rights Act of 2020 and the National Labor Relations Act, the last of which affords employees certain protections to discuss terms and conditions of employment.
The Federal Wiretap Act and the Stored Communications Act may also come into play. Europe, meanwhile, has in some respects more restrictive employee surveillance and data collection laws, such as the General Data Protection Regulation.
In a survey of C-suite executives released last month by the employment law firm Littler Mendelson, nearly 40% of respondents said their organization used automated monitoring systems to track generative AI use among employees.
In its report on the study, Littler called the hefty percentage surprising, “given the ongoing focus on privacy protections in the workplace. … Employers that do so should be diligent about protecting employee data, adhering to labor-law obligations, and avoiding any perception of bias or discrimination.”
The panel pointed to studies indicating that just one-third of employees receive clear guidelines about monitoring practices.
George Dunston, chief privacy officer and associate general counsel of Barnes & Noble, said his company has an employee privacy policy. He read from sections noting the company does not use surveillance as a tool to control the working activity of employees but rather only to protect its legitimate business interests.
Such policies may be helpful not just for legal reasons. According to data cited by the panel, 27% of employees would consider quitting if subjected to surveillance, and nearly 60% question the morality of such practices.
“Additionally, the impact on job satisfaction, burnout and micromanagement is significant, with many employees feeling stressed and over-observed,” stated the panel’s synopsis.
Dunston urged general counsel to obtain buy-in from the C-suite and from the board regarding surveillance and privacy policies. That includes involving all stakeholders in the company to define purpose and scope for legitimate business needs, balanced with employee morale and legal compliance.
Panelists also suggested obtaining employee notification and consent for monitoring, along with regular training on technology use.
“Really look at your acceptable use policies,” Rivers emphasized.
The consequences of using surveillance technology unlawfully can be expensive.
In January, France’s data protection authority, the CNIL, fined Amazon $35 million, alleging it used an “excessively intrusive system” to monitor warehouse workers’ scanning speeds and periods of inactivity, putting the company in violation of the European Union’s General Data Protection Regulation.
Amazon denied the allegations, calling them “factually incorrect” and defending its warehouse management systems as “industry standard” and “necessary for ensuring the safety, quality and efficiency of operations and to track the storage of inventory and processing of packages on time and in line with customer expectations.”
Panelists said AI adoption is creating the potential for even greater mistakes. How, for example, do you make sure an AI tool doesn’t have bias built in, Nguyen wondered.
As surveillance tech rapidly develops, employers need to pause and think thoroughly about the implications of its use, Rivers said. She challenged employers to ask what is the “least intrusive” way to accomplish the task.
“Employee trust is a big one that I keep coming back to.”