Cybersecurity and Usability


Cybersecurity and usability is a topic I have covered before, though with a focus on IoT. This post takes a broader view and covers additional ideas and concepts. It is a subject that cannot be learned once and then filed away: with every new design there is a risk of forgetting the design principles for usability, and even when usability is considered from the start it may be weakened by feature creep and changing requirements. It is therefore always worth reminding yourself of, and refreshing, the principles behind usability in cybersecurity.


Getting the balance between cybersecurity and usability right is critical. At one extreme, we can make systems so secure that they are never attacked, but no one could ever access or use them. At the other, systems that are very easy to use might have little or no security. (It is, of course, still possible to design a system that has no security and is also difficult to use.)

Many employees may treat cybersecurity as secondary to their normal day-to-day work, often leaving their organisation vulnerable to cyber attacks, particularly if they are stressed or tired. When users perform tasks that comply with their own mental models (i.e. the way that they view the world and how they expect it to work), the activities present less of a cognitive challenge than those that work against these models.

If a user can apply their previous knowledge and experience to a problem, less energy is required to solve it in a secure manner and, in theory, they are less mentally depleted by the end of the day. It is worth remembering that human beings are not always consistent from one day to the next, so tolerance and leeway should be built in to mitigate mistakes where possible.

Building on existing mental models makes it easier for people to adopt new technologies and ways of working. However, such mappings must take cultural background into consideration, as a design that works well in one part of the world may not work in another.

Good interface design should not only lighten the burden on users but also complement security, though this can be subjective and vary with design philosophy and taste. It has often been assumed that security and usability always contradict each other — that security makes things more complicated, while usability aims to improve the user experience. Done right, they can support each other by distinguishing constructive activities from destructive ones. Effective design should make constructive activities simple to perform while hindering destructive ones.

This can be achieved by incorporating security activities into the natural workflow of productive tasks, which requires the involvement of security and usability professionals early in the design process. Security and usability shouldn’t be extra features bolted on once the system has been developed, but an integral part of the design from the beginning. Further reading on secure-by-design and secure-by-default principles is worthwhile if you are interested.
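As a loose illustration of the secure-by-default idea, consider a configuration object whose safest behaviour is what users get without any extra effort, while weakening security requires an explicit, visible choice. This is a minimal sketch; the class, settings, and thresholds are hypothetical, not drawn from any particular product.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ServerConfig:
    """Hypothetical server settings: the secure values are the defaults,
    so the constructive path (just instantiate it) is also the safe path."""
    require_tls: bool = True
    session_timeout_minutes: int = 15
    allow_anonymous: bool = False

    def insecure_settings(self):
        """Report any settings that weaken security, making destructive
        choices visible rather than silent."""
        issues = []
        if not self.require_tls:
            issues.append("TLS disabled")
        if self.session_timeout_minutes > 60:
            issues.append("session timeout over an hour")
        if self.allow_anonymous:
            issues.append("anonymous access enabled")
        return issues
```

With this shape, `ServerConfig()` is secure out of the box, and anyone opting out (e.g. `ServerConfig(require_tls=False)`) has to write the destructive choice down explicitly, where a reviewer can see it.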

Security professionals, supported by usability professionals, can provide input into the design process via several methods, such as iterative or participatory design. In the iterative method, each development cycle is followed by testing and evaluation; the participatory method ensures that key stakeholders, including security professionals, have an opportunity to be involved.


Hopefully, this provides a quick read about cybersecurity and usability along with a couple of new ideas. I think this discussion is becoming more important and vital, yet it remains under-discussed; I have attended a couple of cybersecurity conferences and been the only speaker with a human factors/user perspective. As companies aim to avoid becoming victims of cyberattacks, there is a risk that any measures they bring in will impede their employees or users, who may then seek ways to bypass the hassle, undermining the security measures in the first place. Avoiding that situation requires input from everyone affected, not measures forced or imposed without consideration by those implementing them.


