09 January 2019 | Herpreet Kaur Grewal
Herpreet Kaur Grewal presents the findings from this month's Think Tank on organisations monitoring employee behaviours through IoT technology.
Swedish microchip implant company Biohax recently told a British newspaper that it was in talks with a number of UK legal and financial firms to implant staff with devices.
But TUC General Secretary Frances O'Grady said: "Microchipping would give bosses even more power and control over their workers. There are obvious risks involved, and employers must not brush them aside, or pressure staff into being chipped."
So how likely is it that IoT security technology will become common in the workplace?
For this month's Think Tank, we asked: Is your organisation monitoring employee behaviours or maintaining security restrictions through IoT technology such as microchipping and sensors? Here's what you said.
Personal and corporate benefits
Today's visitor management tools enable companies to use facial recognition and automatic number plate recognition to allow people access to facilities, whether through an entry barrier, to a particular floor, or via a pre-programmed journey within a facility that provides access to common areas and certain meeting rooms but not to sensitive corporate areas.
But research we conducted in the UK and US revealed a level of reluctance among people to use biometric data to check in as a visitor to a workplace. Of the 2,000 US and UK office workers we polled in the annual Office Workers Bugbear survey, a third feel uncomfortable about providing personal data for check-in, and 35 per cent are nervous about signing in via fingerprint, facial recognition or voice recognition software, feeling that it is unnecessary for the level and purpose of their visit.
For the individual it can mean a personalised approach to their employee or visitor experience where their preferences, from temperature and lighting to catering choices, are taken into account during their stay. For the company, it's insight into who is in the space, how they're using it and their specific preferences that can provide insight for corporate real estate, FM and catering decisions.
Gregory Blondeau, co-CEO of visitor management app Proxyclick
Trust is key
Implanting microchips into employees at work is a controversial development. There are some obvious good reasons for doing this, such as granting people access to secure areas of the organisation, or knowing where someone is when they are 'on call' in case of an emergency (e.g. in a hospital environment).
The problem is not the technology but how it is introduced and what is the 'real' intention of the monitoring. If there is a high level of trust in the organisation and in your line manager, and it is explained clearly what it is being used for and what it is not used for, then it is likely to be more favourably received. But if morale is low and there is a lack of trust in senior management and their intentions, it is likely to cause enormous resistance, and may in the medium term lead to trade union involvement or poorer performance or to higher levels of labour turnover. There has to be a clear set of guidelines on how it will be used and the activities that it won't be used for (e.g. tracking performance, time at lunch or during breaks, or who is communicating with whom).
Prof Sir Cary Cooper is professor of organisational psychology and health at Alliance Manchester Business School, University of Manchester, and director of Robertson Cooper Ltd
Expect a culture shift
We have long accepted elements added into the body such as piercings, stents, pacemakers, cochlear implants and IUDs. Implanted chips are a small step along from wearable technology (Apple Watch or Fitbit) and have been happening for a few years in early-adopter organisations. The technology is straightforward; the moral and ethical implications less so.
How much accountability can an employee be expected to provide? Security guards are already using 'proof of presence' systems such as NFC tags or barcode swipes to confirm that they have undertaken the correct patrols; engineers are tracked by GPS in their vehicles. Chips could provide more detailed information: how much time did Geoff spend at the water-cooler this morning? Privacy appears to be the main issue. But younger people often have a more relaxed view, so perhaps we can expect a culture change about this.
Many people broadcast their location 24/7 through apps such as Snapchat to connect with friends. The key to acceptance of personal chipping will be the introduction of significant social, safety or financial benefits, all of which are easy to extrapolate from current common technologies (contactless payment, no lost door keys, or phones). Ultimately, the convenience is likely to make chipping an appealing option.
This technology will be ubiquitous pretty quickly. However, as part of the drive towards individuals rather than organisations controlling and owning data, I think it will be driven by people having themselves tagged for their own benefit, then deciding what access an employer may have to the data.
Lucy Jeynes, MD of Larch Consulting
Beam us up, Scotty
My first thought on this topic was of 'The Borg'. Anyone of my generation may recall this Star Trek cybernetic organism that was linked in a hive mind to 'the collective'.
Technology is embedded in our work and our private lives. Young children already swipe left before picking up a pencil, so it is clear that technology will continue to affect the very nature of our being. The idea that we would walk around with microchips inside us is not something I am comfortable contemplating. Historically, we have fiercely protected the individual's right to privacy. But then again, a lot of Star Trek tech is now a reality, so 'beam me up, Scotty', as Kirk would say.
Angela Love, director, Active Workplace Solutions
Humans 'going digital'
I already have a near field communication (NFC) chip implanted in my hand. It was created by a company called Dangerous Things and installed by its CEO, biohacker Amal Graafstra.
My role at Atalian Servest means I have to stay at the forefront of all aspects of technology and to test the 'why' and the 'how' in relation to human beings. As both organisations and industries undergo digital transformation, it is key to remember that humans are the third element of this transformative activity. These changes are creating a myriad of new possibilities but at the same time are causing understandable anxiety. In the press we often see and hear about the idea of humans 'going digital', but we rarely stop to consider what this actually means. We live in a world of physical things (atoms) that we can see, and the digital world (bits) that we can't; we simply aren't biologically equipped to 'see' both realities. The rise of the consumerisation of IT means that we're surrounded by devices that access this digital world.

At present, the reality of microchip implants is that they are generally passive devices and can only hold a small amount of data; my own chip has been used to replace my corporate door pass in the past and at present holds my business card data, so I can just tap my hand onto someone's device to give them my contact details.
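To give a sense of just how little data such a passive chip carries, here is a minimal sketch of the kind of payload involved: a vCard wrapped in a single short-form NDEF record, the format NFC tags use for storing data. The contact details and the capacity figure are illustrative assumptions, not details from the author's own chip (tags of the NTAG216 class, often cited for implants, offer on the order of 880 bytes of user memory).

```python
# Illustrative sketch: a business card encoded as one short-form NDEF
# record, as a passive NFC implant might store it. Names and numbers
# are hypothetical; the capacity figure is approximate.
VCARD = (
    "BEGIN:VCARD\r\n"
    "VERSION:3.0\r\n"
    "FN:Jane Example\r\n"          # hypothetical contact details
    "ORG:Example Ltd\r\n"
    "TEL:+44 20 7946 0000\r\n"
    "END:VCARD\r\n"
).encode("utf-8")

def ndef_short_record(mime_type: bytes, payload: bytes) -> bytes:
    """Build one short-form NDEF record (payload under 256 bytes).

    Header byte 0xD2 sets the MB, ME and SR flags with TNF 0x02
    (MIME media type), per the NFC Forum NDEF specification.
    """
    assert len(payload) < 256, "short-record form only"
    return bytes([0xD2, len(mime_type), len(payload)]) + mime_type + payload

record = ndef_short_record(b"text/vcard", VCARD)

# Roughly 880 bytes of user memory on an NTAG216-class chip, so even a
# full record is a comfortable fit -- but nothing bigger would be.
TAG_CAPACITY = 880
print(len(record), len(record) <= TAG_CAPACITY)
```

The point the sketch makes concrete is the one in the paragraph above: the entire payload here is around a hundred bytes, and the whole tag holds less than a kilobyte, so these implants are closer to a door pass than to a tracking device.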
The smart organisations are the ones who are aware of the potential and are experimenting now, not so much with the technology itself but with developing a sense of how it will be received by humans.
Lewis Richards, CDO, Atalian Servest
Are we ready for personal microchipping and sensors?
18.18% - Yes, it is definitely happening and will only become more popular
45.45% - No, it is not happening
36.36% - It is happening in some circumstances, but is unlikely to become popular