The number of devices making up the internet of things (IoT) is expected to reach 75 billion in the next five years. That is close to ten connected devices per human, and a threefold increase on the installed base of 2018. This growing connectedness naturally feeds anxieties about how the data those devices generate will be used to influence consumers.
In the face of major data leaks, it is natural that consumers become more tech-averse as the workings of technology move ever further out of their own hands. In the UK, 43% of businesses identified an attack or security breach in 2020. It is incumbent on developers to assuage these concerns on the user end, or risk genuinely worthy, value-adding innovations being overlooked.
A 2022 consumer data privacy report by the Data & Marketing Association found that the majority of UK consumers (69%) had high levels of online privacy concerns. Less than half (46%) identified as ‘data pragmatists’, meaning they are willing to provide their personal data to businesses so long as there is a clear benefit in return.
Nearly a quarter (23%) said they were unwilling to share information for any reason. Given that the next generation of tech products and platforms, such as AI and IoT, will require implicit trust from a plurality of users, this reflects a serious issue with how tech has been communicated in the past.
Much of the discourse around this issue revolves around the ethics of companies choosing to sell or share user data with third parties. But perhaps not enough attention is given to a parallel issue: as cloud-based products grow smarter, what a consumer can actually do with them increasingly depends on their level of trust and their willingness to share sensitive personal details with digital services.
IoT products: Privacy matters
On the surface, it is a simple comparison: using analogue means a user has complete control over their information, while emerging digital products and platforms require trust in third parties to use their information securely and responsibly.
A handwritten letter, for instance, contains only the information the sender wishes to include. It is unlikely, if intercepted, to reveal sensitive personal information – and in the case that it does, the sender will have included that information with complete knowledge of, and responsibility for, the risks involved.
This is not always the case with digital services, as even the digitally native generation can struggle to explain the processes through which their emails are sent, or their instant messages delivered.
By the same token, it is fair to say that encrypted instant messages have (to date) proven more secure against intruding eyes than a paper envelope, and email servers more reliable at reaching their intended recipient than hand delivery by the postal service.
This is to say that the risks and rewards of digitising personal information are, by nature, heightened with our current and emerging suite of sophisticated digital technologies – particularly cloud-based platforms, which not only demand more sensitive information but also produce richer user data in return through the internet of behaviours (IoB).
Evidently, there is a balance to be struck in the infrastructure underpinning privacy – and, most importantly, in the confidence users have in their own security, since it is users who power these platforms. Developers carry a responsibility to users (and, in turn, to their client businesses) to develop software responsibly and ensure that users are along for the ride.
In my own experience running a digital design and development agency, this is often at the forefront of our thinking. For example, we partnered with a startup looking to launch a web and app platform in the rental real estate market. Alongside the typical work on core functionality and design, there were a number of features offering genuine innovation on the user side – but they required users to link their bank accounts to the app.
At face value, this is a perfect example of an area where user hesitancy could hamstring a digital innovation. Coupling intimate personal data with a new product can create a sense of the ‘unproven’, making consumers more likely to judge the possible downsides as outweighing the upsides – even when the reward is novel, innovative, and beneficial.
Our solution was to build the design around a drive to educate and assure users, covering how the product works, how their data will be used, and what security measures are in place to ensure user protection.
Beyond simply developing robust security measures, there should be a focus on communicating them incrementally, educationally, and most importantly, transparently. In many cases, this information is tucked away in small print – but with hesitancy growing, placing these considerations boldly on the front end is not only a necessary step, but a valuable marketing tool when attracting new users.
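In practice, putting these considerations "boldly on the front end" can be as simple as gating the sensitive step behind explicit, plain-language disclosure. The sketch below is purely illustrative – all names are hypothetical assumptions, not the platform's actual code – but it shows the shape of an onboarding flow where a sensitive action (such as bank-account linking) is only offered once each explanation has been acknowledged:

```typescript
// Hypothetical sketch: gating a sensitive permission behind incremental,
// plain-language disclosure steps. All names here are illustrative.

type DisclosureStep = {
  id: string;
  message: string;       // plain-language explanation shown to the user
  acknowledged: boolean; // true once the user confirms they have read it
};

const onboarding: DisclosureStep[] = [
  { id: "how-it-works", message: "What the app does, in plain terms", acknowledged: false },
  { id: "data-use", message: "Exactly which data we read, and why", acknowledged: false },
  { id: "security", message: "How your data is protected and stored", acknowledged: false },
];

// The sensitive action is only unlocked once every disclosure has been
// explicitly acknowledged – security information up front, not in small print.
function canRequestBankPairing(steps: DisclosureStep[]): boolean {
  return steps.every((s) => s.acknowledged);
}
```

The design choice this encodes is the article's point in miniature: the security explanation is a first-class part of the flow, not an optional link to a policy page.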
A better ecosystem for all
This is beneficial to developers too. Developing products and platforms with swathes of impressive new features is always great – but not if they alienate concerned users or are ignored altogether in favour of the core offering. In independent research commissioned by Studio Graphene, taking in the views of 2,000 UK consumers, we found that users are fatigued by feature overload – with 60% feeling too much new tech is developed without proper thought as to whether it is really needed.
The tech sector should concentrate on providing genuine, value-adding innovation – and on using design smartly to hold users’ hands from onboarding through to routine use. The research threw up a clear case in point: despite decades of effort and treasure poured into extolling the virtues of the cloud, 50% of respondents said they had either not heard of or never used cloud products.
Just 15% felt they ‘regularly’ used cloud. This dissonance between tech saturation and literacy is symptomatic of limited thinking around welcoming users appropriately to new products and underlines the issues which emerging technologies will face if developed without appropriate consideration on the user side.
The wider spread of AI and IoT technologies will naturally lead many to think more carefully about what information they share – as, once shared, it will become ever harder to retrieve. These natural risk factors, and the bad actors in the space, should not be allowed to muddy the waters for positive innovations.
Instead, developers should be proactive in communicating their own values through the product, and use design to provide users with the confidence, rather than the assumption, of being in safe hands.