2020: Standardization and Personalized Medicine Using Quantitative EEG in Clinical Settings (Plenary)

Presented by Andre Keizer, PhD: Two major trends have been dominant in healthcare in recent years. First, there is a growing consensus that standardization of healthcare procedures and methods can improve the effectiveness and safety of treatments. There has been a great effort to implement more standardization in diagnosing and treating medical conditions. For example, the World Health Organization initiated the ‘High 5s project’ in 2007, which aimed to facilitate the development, implementation and evaluation of Standard Operating Protocols (SOPs; https://www.who.int/patientsafety/topics/high-5s/en/) for medical treatments in order to increase patient safety. Second, there is increased interest in ‘personalized medicine’, which refers to the tailoring of treatments to individual patients. Personalized medicine has gained considerable traction, mainly as a result of rapid developments in genetic research (Hamburg and Collins, 2010; Jameson and Longo, 2015) and novel methods for analysing ‘big data’ (Alyass et al., 2015). In the current presentation, I will discuss both standardization and personalized medicine in their historical context, how they are applied in different medical fields and, most importantly, how these trends apply to the field of quantitative EEG (qEEG). Two important topics relating to the use of qEEG in clinical practice need to be addressed in the context of standardization. The first concerns the way resting-state EEG data are de-artifacted. Traditionally, EEGs are de-artifacted manually by trained EEG technicians or EEG researchers. In recent years, however, standardized and automated de-artifacting procedures have increasingly been used in scientific research and in clinical practice. The advantages of these procedures over manual de-artifacting will be discussed. The second topic concerns the use of normative databases to assess clinically meaningful deviations in a patient’s EEG.
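The presentation does not include code, but the automated de-artifacting step it refers to can be illustrated with a minimal sketch. The example below shows one common ingredient of such pipelines, peak-to-peak amplitude rejection of epochs; the function name, threshold, and data shapes are illustrative assumptions, not the method of any particular qEEG package:

```python
import numpy as np

def reject_artifact_epochs(epochs, amplitude_uv=100.0):
    """Drop epochs whose peak-to-peak amplitude exceeds a threshold.

    epochs: array of shape (n_epochs, n_channels, n_samples), in microvolts.
    Returns the retained epochs and a boolean mask of kept epochs.
    """
    ptp = epochs.max(axis=2) - epochs.min(axis=2)   # peak-to-peak per epoch, per channel
    keep = (ptp < amplitude_uv).all(axis=1)          # reject an epoch if any channel exceeds
    return epochs[keep], keep

# Synthetic example: 5 epochs, 2 channels, 256 samples of ~10 uV noise
rng = np.random.default_rng(0)
data = rng.normal(0.0, 10.0, size=(5, 2, 256))
data[3, 0, 100] = 500.0                              # inject one large artifact
clean, mask = reject_artifact_epochs(data)
```

Unlike manual review, a rule like this applies the same criterion to every recording, which is the standardization advantage the abstract highlights.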
The results of a systematic comparison between two commonly used qEEG databases show that these databases produce very comparable results, illustrating not only the validity and reliability of both databases, but also the opportunity to move toward standardized use of qEEG in clinical practice. The standardization of qEEG analyses enables valid and reliable use of qEEG for diagnostic procedures, for guiding personalized treatment protocols and for assessing treatment effectiveness. Finally, the standardization of qEEG interpretation as both a diagnostic and treatment-selection tool provides an example of how qEEG can merge personalized medicine and standardization in the treatment of psychological disorders.
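The normative-database assessment mentioned above typically expresses a patient's qEEG feature as a z-score against population norms. The sketch below assumes hypothetical normative values (the numbers and the |z| ≥ 2 cut-off are illustrative, not taken from either database in the comparison):

```python
# Hypothetical normative values: mean and SD of a qEEG feature (e.g. log
# alpha-band power at one channel for one age band), as a normative
# database might store them. Numbers are illustrative only.
norm_mean, norm_sd = 1.20, 0.15

def normative_z(patient_value, mean, sd):
    """Express a patient's qEEG feature as a z-score relative to norms.

    |z| around 2 or more is a common screening convention for a
    potentially clinically meaningful deviation.
    """
    return (patient_value - mean) / sd

z = normative_z(1.65, norm_mean, norm_sd)  # patient's value for the same feature
deviant = abs(z) >= 2.0
```

Two databases "producing very comparable results" would mean, in these terms, that the same patient value yields similar z-scores against either set of norms.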
