Beyond Borders

Welcome to ELSA Tilburg's Blog 

on Law and Technology!

"This blog aims on creating a forum where law students can share their opinions

on current legal topics without the delay which typically comes with more formal publications" 

Miriam Rezaeian
Vice President for Academic Activities 2020/2021


(And who has a right to it?)

Laura Kata Szegi

During the last few decades, as neuroscience has been rapidly advancing, the idea of mind reading through technological tools has popped up in pop culture, one of the best-known examples being the 2002 movie Minority Report. Most of us have also seen, or at least heard about, the successful Netflix show Black Mirror. In the episode titled Crocodile, the concept of an investigative technique that visualizes the memories of individuals for another party to see is brought up, and even though the episode itself has been critiqued for not exploring this angle more deeply1, the question obviously arises: Who has a right to our mind? Are there any circumstances, for example a criminal investigation, that allow our thoughts and memories to be monitored or retrieved by the authorities?

It is important to acknowledge that brain scanning technologies are not quite "there yet". There is no way to draw and print out a perfect image of what someone is dreaming about - although advances are being made, and in some studies researchers were able to at least roughly identify whether the participant was, for example, dreaming about a car.2 It is still not possible to put a suspect into a futuristic machine and read his memories of the incident in question in pursuit of the truth. However, as neuroscientist Jack Gallant has pointed out, it is worth starting to think about these issues now, so that we have a framework to work within when the technology finally arrives.3 As is well known, the process of implementing laws - or fitting a technology into an existing legal framework - works rather slowly; the question of the legal nature of our thoughts from a criminal procedural perspective therefore deserves to be discussed.

One of the best-known legal phrases, mostly thanks to crime dramas, is the right to remain silent. Strongly connected to this is the privilege against self-incrimination, or nemo tenetur. Article 14(3)(g) of the ICCPR, to which 173 countries are party, specifically provides that everyone is entitled "[n]ot to be compelled to testify against himself or to confess guilt"4. Obviously, if the suspect does not want to incriminate himself, he has to have a right to remain silent, right? But what about your mind? Can you silence your thoughts? Does the prosecution (of the average Western jurisdiction) have a right to look at images taken of your brain if those images may contain information that you do not want to share?

In its 1996 decision in Saunders v UK, the ECtHR stated that the privilege against self-incrimination "does not extend to the use in criminal proceedings of material which may be obtained from the accused through the use of compulsory powers but which has an existence independent of the will of the suspect such as, inter alia, documents acquired pursuant to a warrant, breath, blood and urine samples and bodily tissue for the purpose of DNA testing"5. Simply put, the court decided that if a specific piece of evidence, such as breath or DNA, exists independently of the will of the suspect, then it can be obtained and used against him at trial. How we define our thoughts from an evidentiary perspective is therefore the main question to answer. For the purposes of this article, I will not examine this from a philosophical perspective, although the question of free will and consciousness inevitably arises when trying to determine whether our thoughts exist independently of us. Rather, it is worth looking at some comparisons. The aforementioned decision also touches upon documents acquired through a warrant. If we look at images or files of the brain, or however the future technology presents itself, then the information in those documents, provided that they were obtained through a warrant, is certainly legitimate and usable evidence according to the standards set by the ECtHR. A warrant, however, can only be obtained if the authorities are certain that such documents exist - and it is safe to say that determining whether a certain piece of information exists in a person's brain is hardly ever certain.

Another similar, and already existing, tool is the polygraph, commonly known as the lie detector. These tests have major similarities to future brain scanning techniques in that both intend to gain information about something that exists "inside the head" of the person undergoing them. It is widely known, however, that polygraphs are not foolproof, and they are therefore generally not admissible in court in Western societies. Furthermore, and connecting this to the concept of nemo tenetur, in most European countries polygraphs are seen as tools which violate the right to remain silent; courts have therefore rejected their use, and policies have been enacted against them.6 It is hard to imagine, then, that brain scanning technologies with far greater potential and accuracy would be seen by courts as the way forward.

When weighing risks and gains, rights and wrongs, it is important to keep in mind that although science is an amazing tool which can be used to help the criminal justice systems of our societies, at the end of the day the decision on guilt or innocence is in the hands of the judge or jury, not the neuroscientist.7 Therefore, instead of developing new legal approaches for each new wave of science to fit into, it might be more desirable to apply our existing frameworks and approaches, which already contain doctrines of fundamental principles, to the new technologies.8 If that is the way forward, then based on the stand Europe took on the polygraph, we might never see the day these brain scanning techniques enter a court of law.


1 Wilkinson A, "Black Mirror's 'Crocodile' Plumbs Our Memories and Drags out What Lurks in Them" (Vox, December 29, 2017) <> accessed April 14, 2021

2 Miller G, "Scientists Decode Dreams With Brain Scans" (Wired, June 4, 2017) <> accessed April 14, 2021

3 Miller G, "Scientists Can't Read Your Mind With Brain Scans (Yet)" (Wired) <> accessed April 14, 2021

4 UN General Assembly, International Covenant on Civil and Political Rights, 16 December 1966, United Nations, art. 14(3)(g)

5 Saunders v. The United Kingdom, 43/1994/490/672, Council of Europe: European Court of Human Rights, 17 December 1996, para. 69

6 Meijer, Ewout H; van Koppen, Peter J (2017). "Chapter 3. Lie Detectors and the Law: The Use of the Polygraph in Europe". In Canter, David; Žukauskiene, Rita (eds.). Psychology and Law: Bridging the Gap. Routledge. ISBN 9781351907873.

7 Kraft, C. J., & Giordano, J. (2017). Integrating Brain Science and Law: Neuroscientific Evidence and Legal Perspectives on Protecting Individual Liberties. Frontiers in Neuroscience, 11, 621.

8 Ligthart, S., Douglas, T., Bublitz, C. et al. Forensic Brain-Reading and Mental Privacy in European Human Rights Law: Foundations and Challenges. Neuroethics (2020).



Satwik Singh

In this blog post I analyse the privacy policy1 of the Clubhouse app through the prism of the General Data Protection Regulation (GDPR)2 and the E-Privacy Directive (Directive).3 After completing a rigorous course in Privacy and Data Protection, it has become a hobby of mine to examine whether the privacy and cookie policies of different websites are in consonance with the principles of the GDPR and other privacy regulations and directives. The initial aim behind this exercise was to test my own understanding of the subject more than anything else; soon, however, I discovered a disturbing trend: most websites' privacy and cookie policies comply with neither the spirit nor the letter of the GDPR and the Directive. This means either that, even some years after the GDPR entered into force, real understanding of its provisions and how to comply with them is still at a nascent stage, or - the more disturbing conclusion - that companies do not care enough about the personal data they process and are more interested in paying mere lip service to the legislation in the hope of limiting their liability.

Naturally, the chatter about the huge privacy and cybersecurity risks surrounding the new social media sensation Clubhouse caught my attention. For the uninitiated, Clubhouse is the newest kid on the block of social media and networking apps and has seen a meteoric rise in popularity and usage.

This combination of exclusive access and a unique audio-only format drove the app very high on the popularity list. I had no great expectations when I started reading Clubhouse's Privacy Policy; however, I have no hesitation in admitting that I was taken aback by the extent of its non-compliance with the GDPR. The following sections identify the concerning parts of the privacy policy, along with my explanation of why each part is contrary to the provisions of the GDPR.

1. “By visiting Clubhouse's website(s) and all other products, services and applications made available by Clubhouse from time to time (collectively, the "Services"), you acknowledge that you accept the practices and policies outlined in this Privacy Policy. By using the Services, you are consenting to have your personal data transferred to and processed in the United States.”

This is part of the first paragraph of the Privacy Policy of the Clubhouse app, and it immediately raises a big red flag when seen in the context of data processing under the GDPR and the E-Privacy Directive. Calling it just a big red flag is admittedly an understatement, because here the app has managed to breach not just a single article of the GDPR but an entire chapter. Chapter 5 of the GDPR deals with the various situations in which data can be processed in, or transferred to, a country outside the EU - in this case the United States. Long story short, Articles 45, 46, 47 and 49 provide the framework for how data can be transferred outside the territory of the EU. The basic premise of the chapter is that data may only be transferred outside the EU if adequate protection is ensured - through an adequacy decision, appropriate safeguards, or one of the narrow derogations. The striking down of the EU-US Privacy Shield in Schrems II was based on the adequacy principle discussed in this chapter. It is very interesting, albeit quite disturbing, to note that Clubhouse manages to completely ignore everything in Chapter 5 of the GDPR in the very first paragraph of its privacy policy.

2. “Individuals from the European Union ("EU") may only use our Services after providing your freely given, informed consent for Clubhouse to collect, transfer, store, and share your Personal Data, as that term is defined in the EU's General Data Protection Regulation.”

In the very next paragraph, the Privacy Policy of Clubhouse seems to completely ignore Article 5(1)(b) and (c) of the GDPR, which provide that the specific and explicit purposes for which data is processed must be stated and that, in line with the principle of data minimisation, only the amount of data required for those purposes may be processed. From the text it is amply clear that the wording has been deliberately drafted vaguely: it is not clear for which purposes consent to the processing of data is being collected. Furthermore, the consent is not freely given - the wording plainly says "individuals from the EU may only use our Services…". A plethora of judgments exists on this issue, in which courts have interpreted Article 7(4) of the GDPR very strictly to hold that consent is freely given only when the provision of a service is not made conditional on giving that consent. In other words, if a service is denied to users who do not consent to the processing of their personal data, such consent cannot be construed as "valid consent". The Privacy Policy clearly fails to obtain free consent for the processing of personal data.

3. “Certain information that is collected automatically, such as device ID, IP address and phone number, and browsing information that is associated with a user will be treated as Personal Information.”

Prima facie, the wording accepts in no uncertain terms that users are subject to automatic collection and processing of personal data, and as such it clearly attracts the application of Article 22(1)4 of the GDPR. The Privacy Policy fails to acknowledge the data subject's right to choose not to be subjected to such automated processing of data. In addition, Clubhouse also accepts that the data may be vulnerable to external attacks, yet accepts no liability for this.

While Clubhouse offers a genuinely unique service and may be of real benefit to new businesses by helping them build a brand name, it is a nightmare in terms of privacy risks. In light of these risks, it may therefore be advisable to resist the fear of missing out and the bandwagon effect, and instead embrace the joy of missing out when it comes to Clubhouse - an app which, prima facie, has no qualms about failing to protect personal data or breaching GDPR principles!


1 The Privacy Policy of the Clubhouse app can be accessed at <> accessed on 20 March 2021

2 The General Data Protection Regulation can be accessed at <> accessed on 20 March 2021

3 The E-Privacy Directive can be accessed at <> accessed on 20 March 2021

4 Article 22(1) of the GDPR reads: "The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling... or similarly significantly affects him or her".