
How GDPR has changed the rules for design researchers

Published 18 February 2020 · Lucy Denton

For a few short weeks in mid-2018, the online world fell into a curious alignment. There was a steady stream of messages through people's inboxes from every large seller, publisher, not-for-profit group, and social network they’d ever had dealings with. They were all promising that the public's trust was important to them and that they had an ongoing commitment to transparency. The future of data protection seemed to have arrived. It was compelling and it was everywhere.

The General Data Protection Regulation, a law of the European Union that came into effect on May 25th of that year, is a nerve-wracking prospect for most organizations. It empowers individuals to control what others do with their information. It is also designed to avoid the jurisdictional problems that traditional territorial boundaries create for the cross-border activities of the Internet.

As a result, the threat of steep fines for violations (up to €20 million or 4 percent of a company's global annual revenue, whichever is greater, for severe breaches) caught the attention of a post-Cambridge Analytica world still obsessed with Big Data. Beyond concerns about legal compliance lurks something closer to home for businesses: public awareness of the principles involved is now so broad that any breach will measurably damage an organization's reputation. The net effect is that the GDPR is well on its way to becoming the de facto global standard for data privacy legislation and the ethical management of personal information.

Why does this matter to design researchers?

Design researchers and UX practitioners routinely work with the kinds of personally identifiable information (or PII) defined in the GDPR. The strict requirements and focus on lawful, fair, and transparent data processing have a direct impact on the methodologies and artifacts we use. The two-year transition period leading to May 25, 2018, forced a rapid maturing of practices around sourcing and engaging research participants, gathering background information to help with interviews, and the management and secure storage of personal information.

The GDPR certainly wasn’t designed to impede ethical research. For example, the legislation is written with recognition that any type of data can be useful for research. It understands that research may require access to data for very long periods of time. It allows for personal data to be kept indefinitely, as long as it is only held for scientific, historical or statistical purposes, and on condition that legitimate justifications have been set out.

Why do we care so much about privacy?

The changes resulting from such a wide-reaching legal instrument also trigger an important cultural change in organizations, one that works in favor of design researchers. The GDPR gives businesses a huge incentive to take another look at how they interact with users. It will now be difficult for any large entity using personal data not to know what data it has, where it is held, and what it is doing with it.

The greatest changes are around implementing new transparency requirements and meeting the necessary safeguards, in particular where these don’t already reflect current good research practice.

Employers are liable for data protection breaches by their employees. But sensationalizing the fines or highlighting the difficulties of achieving compliance probably reflects a misunderstanding of how E.U. regulators work. The GDPR clearly spells out that companies that follow the regulations, avoid intentional infringement, take measures to mitigate any damage that does occur, and collaborate with authorities will reduce their likelihood of fines. Having said that, authorities are actively enforcing the regulation, and as of February 2020 have issued at least 168 fines totaling more than €458,000,000.

European Union? I’m miles away! Surely I’m not affected?

If you live and work in the E.U., it’s pretty clear that the GDPR applies to you. However, you must also comply with the GDPR if you offer goods or services to anyone in the E.U. (including citizens, permanent residents, or even tourists), regardless of your location or number of employees. As websites and mobile apps are a service, and I.P. addresses are considered personal data, you’ll need to comply unless you do something drastic like put a blanket block on anyone in the E.U. using your service so you don’t collect any information from them at all.

What kind of personal data is regulated?

The legislation’s official definition of personal data is intentionally broad and covers any data that can be used to identify someone, even indirectly. That includes the following kinds of personal data:

  • Direct and obvious things like names, ID numbers, email addresses, location information, bank details, physical description, health information.

  • Less direct or obvious things, like IP addresses, behavior and activities, social media content, spending habits, cultural background.

Even anonymized or pseudonymized data is regulated if it’s possible to connect the dots and identify a particular individual.

There are also additional considerations for certain kinds of information that are considered extra sensitive and whose protections, in some cases, can’t be waived even by consent: things like racial origins, political opinions, religion, biometrics, health information, sex life, and sexual orientation. If you’re going to work with any of these special categories of data, you should talk to a legal professional.

What is defined as ‘processing’?

Processing means ‘touching’ the data in any way. Article 4 defines it as:

Any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organization, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction.

That includes recording and storage of information, but also someone or something looking at that data. Scanning through a list of possible research participants to select those you’d like to speak with is, therefore, data processing. So is collating it in a spreadsheet, discussing it with your colleagues, and even inviting a participant to a study. The act of anonymizing is also classified as processing data for the purposes of the GDPR.

Upgrading your research practices

While the full GDPR requirements are pretty long and complicated, we can summarize them into a fairly typical design research process to help us see where things fit.

Before you start

Your first consideration needs to be getting proper legal advice. This should be either through your organization’s Data Protection Officer or a reputable lawyer with experience in the area of data protection and the GDPR.

Justify your need for any personal data you plan to collect. You can only collect personal data if you have a legitimate justification. Generally, you’ll need informed consent, and you’re expected to rely on the most limited justifications possible as well as a relevant business need. These justifications need to be specified before you collect the data, as you’ll be using them to request consent and to plan the data collection itself.

Get systems and processes ready. Prepare in advance to ensure they are ready to receive participant data and allow you to meet your data protection obligations. This includes asset storage, security, templates, consent forms, participant indexes, and ideally things like data retention policies.

During Recruitment

Allow prospective participants to opt-in, rather than opt-out. That means starting from scratch if you aren’t 100% clear about the current permissions of your lists and panels.

Tell participants explicitly why you’re collecting their data. This includes what you are doing with it, who you’re sharing it with, and how long you’re keeping it.

Send a copy of the consent form in advance. You might discuss it via phone before engaging participants, or require consent as a condition for booking an interview date. Or you might simply be giving them time to read it before attending and signing in person.

Consider the age of consent. Article 8 of the GDPR sets specific requirements for informed consent from children, and different countries have different rules about the age of consent that must be followed.

Long gone are the days of pre-checked form boxes and legal disclaimers buried in policies. The standard for informed consent in research involves conscientious discussion in person. This avoids a self-service, checklist approach where participants don’t give truly informed consent.

Use granular consent forms that list each piece of data to be collected and processed. This means having a checkbox beside each item so that people can choose what they agree to. This is where limiting third parties and collecting only what you need will help: if every item is relevant to your explained purposes, you avoid asking for unnecessary data that a participant might refuse, which could otherwise block you from engaging them at all.

Ensure participants understand who is doing the research. This means the organization benefiting from the research, as well as who is conducting it.

Explain the purpose of the research. How will their participation help the research aims? What is the thing that will be created or changed?

Be transparent about exactly who will see and handle the data. List every organization, including third parties, who will be involved in collecting or processing the data.

Be clear about how long their data will be retained. Often this is just for the period of research until the analysis has been completed, but some researchers set a blanket rule with a maximum of 1-2 years.

Get pre-approval for recording and observers. Get permission before pressing record on audio or video capture. Let people know if the session is being observed, and who is watching.

Avoid jargon. Consent forms must use plain language to ensure participants are fully aware of what they are agreeing to. Any language that is industry shorthand, or even merely topic-specific, may be obscure for some people.

Keep NDAs and other documents separate from consent forms. The aim is to keep things simple and prioritize informed consent. Keep in mind that comprehensively implementing the GDPR requirements around consent will likely produce a form with major usability issues. You’ll need to look for ways to simplify and clarify while retaining compliance.

Allow for withdrawal of consent. Explain that their participation is totally voluntary and that they can change their mind and withdraw permission, without needing to give a reason, and at any time, whether that is halfway through the interview, or a few months down the track.

Provide an independent channel for complaints. This might mean providing the contact details of the data controller, or of the national privacy authority.

Allow time for reflection. Hurrying, pressuring or encouraging people to gloss over things isn’t conducive to clear thinking. Encourage them to sit and consider things if they would like to.

During research

Only collect what you need to know. Article 5 in the GDPR stipulates that personal data shall be “adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed.” This means you should minimize the collection of unnecessary information.

After research

Keep a unified, protected index of participants. Having a single source of contact details means you don’t need to check multiple places when a participant asks you to provide a copy of their information, correct it, or delete it.

Anonymize your data. Process everything so it can’t be used to identify people and so that it can’t easily be traced back to the individual (you’ll likely need to pseudonymize file names so they can still be tracked via the participant index). Blur faces in any photos or videos that will go beyond the design team. Obscure any identifiable information entered in on-screen forms in the videos. Anonymized quotes from an interview or report that contain no identifiable information aren’t affected.
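
To make the file-naming step concrete, here is a minimal sketch of one possible approach, assuming a simple folder layout and a salted hash. The paths, salt handling, and function names are illustrative assumptions for this example, not a prescribed structure.

```python
import csv
import hashlib
from pathlib import Path

# Hypothetical layout for the sketch: raw files named after participants live
# in ./raw, pseudonymized copies go to ./processed, and the mapping between
# pseudonyms and real names stays in a separately secured index file.
RAW_DIR = Path("raw")
PROCESSED_DIR = Path("processed")
INDEX_FILE = Path("secure/participant_index.csv")
STUDY_SALT = "replace-with-a-secret-generated-once-per-study"

def pseudonym_for(participant_name: str) -> str:
    """Derive a stable, hard-to-reverse label from a participant's name."""
    digest = hashlib.sha256((STUDY_SALT + participant_name).encode("utf-8")).hexdigest()
    return f"P-{digest[:8]}"

def pseudonymize_files() -> None:
    PROCESSED_DIR.mkdir(parents=True, exist_ok=True)
    INDEX_FILE.parent.mkdir(parents=True, exist_ok=True)
    with INDEX_FILE.open("w", newline="") as index:
        writer = csv.writer(index)
        writer.writerow(["pseudonym", "participant_name"])
        for raw_file in sorted(RAW_DIR.glob("*")):
            label = pseudonym_for(raw_file.stem)      # e.g. "jane-doe.mp4" becomes "P-3fa2c1d9.mp4"
            copy = PROCESSED_DIR / f"{label}{raw_file.suffix}"
            copy.write_bytes(raw_file.read_bytes())   # downstream work uses only the pseudonymized copy
            writer.writerow([label, raw_file.stem])

if __name__ == "__main__":
    pseudonymize_files()
```

The point is simply that nothing outside the protected index carries a real name, so notes, clips, and reports can refer to a label like “P-3fa2c1d9” without exposing anyone, while the index still lets you trace a file back to its participant when you need to honor a request.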

Make sure personal information is stored and processed securely. As the ‘controller’ determining the purposes and means of the processing, you are responsible both for how you handle it and for anyone you pass it along to. You must vet any third parties you’ll work with: tools, providers, partners, or anyone else. Think critically about which data needs to be passed on, when it needs to be transferred, and how each third party will use the data in turn. Limit the number of third parties used, and pay close attention to their privacy policies and security practices.

Give users control. One of the GDPR’s main goals is to provide control for individuals about how, where, and why their data is being used. These are enshrined in the rights of the data subject. Here’s a paraphrase in plain language:

  • You need to obtain informed consent (see above).

  • If the participant asks, you must provide a copy of all the data you’ve collected from and about them.

  • If they ask, you need to erase all of their data. They have a ‘right to be forgotten’ that includes completely removing accounts and records unless there is a valid reason for it to be retained.

  • If they ask, you must stop processing and handling their data.

  • If they say that their data is inaccurate, you have to correct it.

  • If the person wants to move from your service to another service, you have to allow them to transfer their data out of your service in a machine-readable format.

  • People have a special right to object to their data being used for certain purposes such as direct marketing.

Organize your data for easy retrieval. In most cases, you have 30 days to process any of these user requests, so make it easy for yourself to accomplish them by staying organized. Standardize the way you label and name research data.
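
As a rough illustration of why that organization pays off, the sketch below handles an access request and an erasure request against the hypothetical pseudonymized layout and index from the earlier example. The file structure and function names are assumptions for the sketch, not part of the regulation.

```python
import csv
import json
from pathlib import Path
from typing import Optional

PROCESSED_DIR = Path("processed")
INDEX_FILE = Path("secure/participant_index.csv")

def find_pseudonym(participant_name: str) -> Optional[str]:
    """Look up a participant's pseudonym in the protected index."""
    with INDEX_FILE.open(newline="") as index:
        for row in csv.DictReader(index):
            if row["participant_name"] == participant_name:
                return row["pseudonym"]
    return None

def export_copy(participant_name: str) -> str:
    """Subject access request: list everything held about one participant."""
    label = find_pseudonym(participant_name)
    files = [str(p) for p in PROCESSED_DIR.glob(f"{label}.*")] if label else []
    return json.dumps({"participant": participant_name, "files": files}, indent=2)

def erase(participant_name: str) -> None:
    """Right to be forgotten: delete the participant's files and index entry."""
    label = find_pseudonym(participant_name)
    if label is None:
        return
    for path in PROCESSED_DIR.glob(f"{label}.*"):
        path.unlink()
    with INDEX_FILE.open(newline="") as index:
        remaining = [r for r in csv.DictReader(index)
                     if r["participant_name"] != participant_name]
    with INDEX_FILE.open("w", newline="") as index:
        writer = csv.DictWriter(index, fieldnames=["pseudonym", "participant_name"])
        writer.writeheader()
        writer.writerows(remaining)
```

With everything labeled consistently and mapped through a single index, responding inside the 30-day window becomes a lookup rather than a search through scattered folders and inboxes.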

Practice data gardening. Consider deleting your raw collected data after full analysis, or after a retention period of one or two years. It can be hard for acquisitive researchers to let go, but data you no longer need should be deleted.
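
If you want the policy to enforce itself, a periodic sweep along these lines can do it. The one-year window and the raw-data folder are assumptions for the sketch; the actual retention period should come from your own documented policy.

```python
import time
from pathlib import Path

RAW_DIR = Path("raw")      # hypothetical location of raw recordings and notes
RETENTION_DAYS = 365       # example policy: keep raw data for one year

def sweep_expired_raw_data() -> list:
    """Delete raw files older than the retention window; return what was removed."""
    cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60
    removed = []
    for path in RAW_DIR.glob("*"):
        if path.is_file() and path.stat().st_mtime < cutoff:
            path.unlink()
            removed.append(path.name)
    return removed

if __name__ == "__main__":
    for name in sweep_expired_raw_data():
        print(f"Deleted expired raw file: {name}")
```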

Keep personal information secure. This requirement also includes the need for proactive monitoring for breaches, keeping records of breaches, and notifying regulators within 72 hours of a breach. It goes as far as specifying regular testing of your security measures. Security needs to be considered as part of the design process of “appropriate technical and organizational measures” and not just bolted on afterward.


Most design researchers have aptitudes that complement the intentions of the GDPR, and for practitioners whose roles advocate for product users and customers, applying responsible and transparent data management techniques shouldn’t be a problem. There is also an opportunity to address GDPR considerations comprehensively as organizations work towards improving their research operations.

Plus, the rules specified by these regulations are moving beyond legal compliance and becoming widely accepted as better, more ethical practices for handling data. With the United Kingdom planning to issue its own GDPR-like legislation post-Brexit, and other countries lining up to do the same, it is a good time to start planning how your research, and your organization, can better protect personal data.

By fulfilling the principles of the European Union’s data protection regulations, design researchers can advance the organizational conversation about respectfully engaging with customers and audiences.
