| Diabetes | CHD = 0 | CHD = 1 | Total |
|---|---|---|---|
| 0 | 2249 | 91 | 2340 |
| 1 | 190 | 26 | 216 |
| Total | 2439 | 117 | 2556 |

\(P_{0}=91 / 2340=3.89 \%\)
\(P_{1}=26 / 216=12.04 \%\)
Prevalence Ratio:
\(PR=P_{1} / P_{0}=12.04 / 3.89=3.10\)
Odds ratio \(=(2249 \times 26) /(91 \times 190)=3.38\)
'0' indicates those who do not have coronary heart disease, '1' is for those with coronary heart disease; similarly for diabetes, '0' is the absence, and '1' the presence of diabetes.
The prevalence of coronary heart disease among people without diabetes is 91 divided by 2340, or 3.9%. Similarly, the prevalence among those with diabetes is 26/216 = 12.04%. Our prevalence ratio, asking whether diabetes is a risk factor for coronary heart disease, is 12.04 / 3.9 = 3.1: the prevalence of coronary heart disease in people with diabetes is 3.1 times as great as it is in people without diabetes.
We can also use the 2 x 2 table to calculate an odds ratio as shown above:
( 2249 × 26) / ( 91 × 190) = 3.38
The odds of having diabetes among those with coronary heart disease is 3.38 times as high as the odds of having diabetes among those who do not have coronary heart disease.
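Both measures can be reproduced directly from the four cell counts. This is a minimal Python sketch (the language and variable names are illustrative, not from the source):

```python
# 2x2 table of diabetes (exposure) vs. prevalent CHD (outcome):
#                 CHD yes   CHD no   total
# diabetes yes        26      190      216
# diabetes no         91     2249     2340
a, b = 26, 190    # exposed:   cases, non-cases
c, d = 91, 2249   # unexposed: cases, non-cases

p1 = a / (a + b)  # prevalence of CHD among diabetics
p0 = c / (c + d)  # prevalence of CHD among non-diabetics

prevalence_ratio = p1 / p0
odds_ratio = (a * d) / (b * c)

print(f"P1 = {p1:.2%}, P0 = {p0:.2%}")       # 12.04%, 3.89%
print(f"Prevalence ratio = {prevalence_ratio:.2f}")  # 3.10
print(f"Odds ratio       = {odds_ratio:.2f}")        # 3.38
```

Note that the odds ratio uses the cross-product of the table, while the prevalence ratio compares row proportions, which is why the two estimates differ slightly.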
Which of these do you use? They come up with slightly different estimates.
It depends upon your primary purpose. Is your purpose to compare prevalences? Or do you wish to address the odds of diabetes as related to coronary heart disease status?
Now, let's add hypertension as a potential confounder.
Ask: "Is hypertension a risk factor for CHD (among non-diabetics)?"
First of all, prior knowledge tells us that hypertension is related to many heart related diseases. Prior knowledge is an important first step but let's test this with data.
We consider the 2 × 2 table below:
HYPERT (Hypertension) × CHD (Prevalent Coronary Heart Disease), cell frequencies:

| Hypertension | CHD = 0 | CHD = 1 | Total |
|---|---|---|---|
| 0 | 1572 | 51 | 1623 |
| 1 | 669 | 39 | 708 |
| Total | 2241 | 90 | 2331 |
| Statistic | DF | Value | Prob |
|---|---|---|---|
| Chi-square | 1 | 7.435 | 0.006 |
| Likelihood Ratio Chi-square | 1 | 6.998 | 0.008 |
| Continuity Adj. Chi-square | 1 | 6.811 | 0.009 |
| Mantel-Haenszel Chi-square | 1 | 7.432 | 0.006 |
| Fisher's Exact Test (Left) | | | 0.997 |
| Fisher's Exact Test (Right) | | | 5.45E-03 |
| Fisher's Exact Test (2-Tail) | | | 9.66E-03 |
| Phi Coefficient | | 0.056 | |
| Contingency Coefficient | | 0.056 | |
| Cramer's V | | 0.056 | |

Effective Sample Size = 2331
We are evaluating the relationship of CHD to hypertension in non-diabetics. You can calculate the prevalence ratios and odds ratios as suits your purpose.
These data show that there is a positive relationship between hypertension and CHD in non-diabetics. (note the small p-values)
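The Pearson chi-square in the table above can be checked by hand from the 2 × 2 shortcut formula. A minimal Python sketch (pure arithmetic, no libraries assumed):

```python
# Hypertension (rows 0/1) vs. CHD (columns 0/1) among non-diabetics.
# For a 2x2 table [[a, b], [c, d]], Pearson's chi-square is
#   chi2 = n (ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))
a, b = 1572, 51   # hypertension = 0: CHD no, CHD yes
c, d = 669, 39    # hypertension = 1: CHD no, CHD yes
n = a + b + c + d

denom = (a + b) * (c + d) * (a + c) * (b + d)
chi2 = n * (a * d - b * c) ** 2 / denom

# Yates continuity correction (the "Continuity Adj. Chi-square" row)
chi2_yates = n * (abs(a * d - b * c) - n / 2) ** 2 / denom

print(round(chi2, 3))        # 7.435, matching the table
print(round(chi2_yates, 3))  # 6.811
```

With SciPy available, `scipy.stats.chi2_contingency([[1572, 51], [669, 39]])` returns the same statistics along with a p-value (`correction=False` gives the uncorrected Pearson value).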
This leads us to our next question, "Is diabetes (exposure) associated with hypertension?"
We can answer this with our data as well (below):
HYPERT (Hypertension) × DIABETES (Diabetes), cell frequencies:

| Hypertension | Diabetes = 0 | Diabetes = 1 | Total |
|---|---|---|---|
| 0 | 1650 | 85 | 1735 |
| 1 | 721 | 136 | 857 |
| Total | 2371 | 221 | 2592 |
| Statistic | DF | Value | Prob |
|---|---|---|---|
| Chi-square | 1 | 88.515 | 0.001 |
| Likelihood Ratio Chi-square | 1 | 82.438 | 0.001 |
| Continuity Adj. Chi-square | 1 | 87.114 | 0.001 |
| Mantel-Haenszel Chi-square | 1 | 88.481 | 0.001 |
| Fisher's Exact Test (Left) | | | 1.000 |
| Fisher's Exact Test (Right) | | | 1.01E-19 |
| Fisher's Exact Test (2-Tail) | | | 1.79E-19 |
Again, the results are highly significant! Therefore, our first two criteria have been met for hypertension as a confounder in the relationship between diabetes and coronary heart disease.
A final question: "Is hypertension an intermediate step on the pathway between diabetes (exposure) and the development of CHD?" In other words, does diabetes cause hypertension, which then causes coronary heart disease? Based on biology, that is not the case: diabetes in and of itself can cause coronary heart disease. Using the data and our prior knowledge, we conclude that hypertension is a major confounder of the diabetes-CHD relationship, not a step on the causal pathway.
What do we do now that we know that hypertension is a confounder?
Stratify....let's consider some stratified assessments...
Stratification and adjustment: the diabetes-CHD relationship confounded by hypertension.
Earlier we arrived at a crude odds ratio of 3.38.
| Diabetes | CHD: Yes | CHD: No | Total |
|---|---|---|---|
| Yes | 26 | 190 | 216 |
| No | 91 | 2249 | 2340 |
| Total | 117 | 2439 | 2556 |

\(OR_{\text{crude}}=(26 \times 2249) /(91 \times 190)=3.38\)
Now we will use an extended Mantel-Haenszel method to adjust for hypertension and produce an adjusted odds ratio. When we do so, the adjusted OR = 2.84.
The Mantel-Haenszel method takes into account the effect of the strata, presence or absence of hypertension.
If we limit the analysis to normotensives we get an odds ratio of 2.4.
| Diabetes | CHD: Yes | CHD: No | Total |
|---|---|---|---|
| Yes | 6 | 77 | 83 |
| No | 51 | 1572 | 1623 |
| Total | 57 | 1649 | 1706 |

\(OR_{\text{HYP-NO}}=(6 \times 1572) /(77 \times 51)=2.40\)
Among hypertensives, we get an odds ratio of 3.04.
| Diabetes | CHD: Yes | CHD: No | Total |
|---|---|---|---|
| Yes | 20 | 113 | 133 |
| No | 39 | 669 | 708 |
| Total | 59 | 782 | 841 |

\(OR_{\text{HYP-YES}}=(20 \times 669) /(39 \times 113)=3.04\)
Both estimates of the odds ratio are lower than the odds ratio based on the entire sample. If you stratify a sample, without losing any data, wouldn't you expect to find the crude odds ratio to be a weighted average of the stratified odds ratios?
This is an example of confounding: the stratified results both lie on the same side of the crude odds ratio. It is positive confounding because the unstratified estimate is biased away from the null; the null is an odds ratio of 1.0. The true odds ratio, accounting for the effect of hypertension, is 2.84 from the Mantel-Haenszel method, so the crude odds ratio of 3.38 was biased away from the null of 1.0. (In some studies you are looking for a positive association; in others a negative, protective association; either way, the estimate differs from the null of 1.0.)
This is one way to demonstrate the presence of confounding. You may have a priori knowledge of confounded effects, or you may examine the data and determine whether confounding exists. Either way, when confounding is present, as, in this example, the adjusted odds ratio should be reported. In this example, we report the odds ratio for the association of diabetes with CHD = 2.84, adjusted for hypertension.
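The Mantel-Haenszel summary odds ratio quoted above can be computed directly from the two stratum tables, using \(OR_{MH} = \sum_i (a_i d_i / n_i) / \sum_i (b_i c_i / n_i)\). A minimal Python sketch (variable names are illustrative):

```python
# Each stratum: (a, b, c, d) with
#   a = diabetes+/CHD+, b = diabetes+/CHD-,
#   c = diabetes-/CHD+, d = diabetes-/CHD-
strata = {
    "no hypertension": (6, 77, 51, 1572),
    "hypertension":    (20, 113, 39, 669),
}

num = sum(a * d / (a + b + c + d) for a, b, c, d in strata.values())
den = sum(b * c / (a + b + c + d) for a, b, c, d in strata.values())
or_mh = num / den

for name, (a, b, c, d) in strata.items():
    print(f"{name}: OR = {(a * d) / (b * c):.2f}")  # 2.40 and 3.04
print(f"Mantel-Haenszel adjusted OR = {or_mh:.2f}")  # 2.84
```

Each stratum contributes in proportion to its size, which is what makes the adjusted estimate a weighted combination of the stratum-specific odds ratios.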
If you are analyzing data using multivariable logistic regression, a rule of thumb is that if the odds ratio changes by 10% or more when a potential confounder is added, include that variable in the multivariable model. The question is not so much statistical significance as how much the potential confounder changes the effect estimate. If a variable changes the effect by 10% or more, we consider it a confounder and leave it in the model.
Controlling potential confounding starts with a good study design including anticipating potential confounders.
In the previous example, we saw that both stratum-specific estimates of the odds ratio fell on the same side of the crude odds ratio. With effect modification, in contrast, we expect the crude odds ratio to lie between the stratum-specific estimates.
Consider the following examples:
Why study effect modification? Why do we care?
If you do not identify and handle properly an effect modifier, you will get an incorrect crude estimate. The (incorrect) crude estimator (e.g., RR, OR) is a weighted average of the (correct) stratum-specific estimators. If you do not sort out the stratum-specific results, you miss an opportunity to understand the biologic or psychosocial nature of the relationship between risk factors and outcome.
To consider effect modification in the design and conduct of a study:
To consider effect modification in the analysis of data:
When you combine men and women, the crude odds ratio = 4.30.
Women — Diabetes vs. Incident CHD, cell frequencies:

| Diabetes | Incident CHD = 0 | Incident CHD = 1 | Total |
|---|---|---|---|
| 0 | 1191 | 25 | 1216 |
| 1 | 93 | 13 | 106 |
| Total | 1284 | 38 | 1322 |

\(Cumulative\ Incidence_{0}=25 / 1219=2.05 \%\)
\(Cumulative\ Incidence_{1}=13 / 106=12.26 \%\)
\(Relative\ Risk=12.26 / 2.05=5.98\)
\(Odds\ ratio=(1191 \times 13) /(25 \times 93)=6.66\)
Men — Diabetes vs. Incident CHD, cell frequencies:

| Diabetes | Incident CHD = 0 | Incident CHD = 1 | Total |
|---|---|---|---|
| 0 | 1003 | 70 | 1073 |
| 1 | 77 | 12 | 89 |
| Total | 1080 | 82 | 1162 |

\(CI_{0}=70 / 1073=6.52 \%\)
\(CI_{1}=12 / 89=13.48 \%\)
\(RR=13.48 / 6.52=2.07\)
\(Odds\ ratio=(1003 \times 12) /(70 \times 77)=2.23\)
Stratifying by gender, we can calculate separate measures for men and women. Look at the odds ratios above. The odds ratio for women is 6.66, compared to the crude odds ratio of 4.30; women with diabetes are therefore at much greater risk of incident coronary heart disease. For men, the odds ratio is 2.23.
Is diabetes a risk factor for incident heart disease in men and in women? Yes. Is it the same level of risk? No: for men the OR is 2.23, for women it is 6.66. Gender modifies the effect of diabetes on incident heart disease, and we can see that numerically because the crude odds ratio behaves like a weighted average of the two stratum-specific estimates, lying between them.
What is the most informative estimate of the risk of diabetes for heart disease? The crude 4.30 is not very informative of the true relationship. It is much more informative to present the stratum-specific analysis.
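The sex-stratified odds ratios follow directly from the two tables above. A minimal Python sketch (the crude OR of 4.30 quoted in the text comes from the combined analysis in the source data; only the stratum-specific estimates are recomputed here):

```python
def odds_ratio(a: int, b: int, c: int, d: int) -> float:
    """OR for a 2x2 table [[a, b], [c, d]]:
    rows = diabetes (0/1), columns = incident CHD (0/1)."""
    return (a * d) / (b * c)

# cells: (no-diab/no-CHD, no-diab/CHD, diab/no-CHD, diab/CHD)
or_women = odds_ratio(1191, 25, 93, 13)
or_men = odds_ratio(1003, 70, 77, 12)

print(f"OR (women) = {or_women:.2f}")  # 6.66
print(f"OR (men)   = {or_men:.2f}")    # 2.23
```

The large gap between 6.66 and 2.23 is the numeric signature of effect modification: no single summary number describes both groups well.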
During data analysis, major confounders and effect modifiers can be identified by comparing stratified results to overall results.
In summary, the process is as follows:
To review, confounders mask a true effect, and effect modifiers mean that there is a different effect for different groups.
You have reached the end of the reading material for Week 3!!! Go to the Week 3 activities in Canvas.
Confounding variables are external factors (typically a third variable) in research that can interfere with the relationship between the dependent and independent variables.
A confounding variable alters the risk of the condition being studied and confuses the “true” relationship between the variables. The role of confounding variables in research is critical to understanding the causes of all kinds of physical, mental, and behavioral conditions and phenomena.
Typical examples of confounding variables often relate to demographics and social and economic outcomes.
For instance, people who are relatively low in socioeconomic status during childhood tend, on average, to do worse financially than others when they reach adulthood, explains Glenn Geher, PhD, professor of psychology at State University of New York at New Paltz and author of "Own Your Psychology Major!" While we could simply conclude that poverty begets poverty, he says, there are other variables that are conflated with poverty.
People with lower economic means tend to have less access to high quality education, which is also related to fiscal success in adulthood, Geher explained. Furthermore, poverty is often associated with limited access to healthcare and, thus, with increased risk of adverse health outcomes. These factors can also play roles in fiscal success in adulthood.
“The bottom line here is that when looking to find factors that predict adult economic success, there are many variables that predict this outcome, and so many of these factors are confounded with one another,” Geher said.
Psychology researchers must be diligent in controlling for confounding variables, because if they are not, they may draw inaccurate conclusions.
For example, during a research project, Geher’s team found the number of stitches one received in childhood predicted one’s sexual activity in adulthood.
However, "to conclude that getting stitches causes promiscuous behavior would be unwarranted and odd. In fact, it is much more likely that childhood health outcomes, such as getting stitches, predict environmental instability during childhood, which has been found to indirectly bear on adult sexual and relationship outcomes," said Geher.
In other words, the number of stitches is confounded with environmental instability in childhood. It's not that the number of stitches is directly correlated with sexual activity.
Another example that shows confounding variables is the idea that there is a positive correlation between ice cream sales and homicide rates. However, in fact, both these variables are confounded with time of year, said Geher. “They are both higher in summer when days are longer, days are hotter, and people are more likely to encounter others in social contexts because in the winter when it is cold people are more likely to stay home—so they are less likely to buy ice cream cones and to kill others,” he said.
Both of these are examples of how it is in the best interest of researchers to ensure that they control for confounding variables to increase the likelihood that their conclusions are truly warranted.
Universal confounding variables across research on a particular topic can also be influential. In an evaluation of confounding variables that assessed the effect of alcohol consumption on the risk of ischemic heart disease, researchers found a large variation in the confounders considered across observational studies.
While 85 of 87 studies that the researchers analyzed made a connection between alcohol and ischemic heart disease, confounding variables that could influence ischemic heart disease included smoking, age, BMI, height, and/or weight. This means that these factors, not just alcohol, could also have affected heart disease.
While most studies mentioned or alluded to “confounding” in their Abstract or Discussion sections, only one stated that their main findings were likely to be affected by confounding variables. The authors concluded that almost all studies ignored or eventually dismissed confounding variables in their conclusions.
Because study results and interpretations may be affected by the mix of potential confounders included within models, the researchers suggest that “efforts are necessary to standardize approaches for selecting and accounting for confounders in observational studies.”
The best way to control for confounding variables is to conduct “true experimental research,” which means researchers experimentally manipulate a variable that they think causes a certain outcome. They typically do this by randomly assigning study participants to different levels of the first variable, which is referred to as the “independent variable.”
For example, if researchers want to determine if, separate from other factors, receiving a full high-quality education, including a four-year college degree from a respected school, causes positive fiscal outcomes in adulthood, they would need to find a pool of participants, such as a group of young adults from the same broad socioeconomic group as one another. Once the group is selected, half of them would need to be randomly assigned to receive a free, high-quality education and the other half would need to be randomly assigned to not receive such an education.
"This methodology would allow you to see if there are differential fiscal outcomes on average for the two groups later in life and, if so, you could reasonably conclude that the cause of the differential fiscal outcomes is found in the educational differences across the two groups," said Geher. "You can draw this conclusion because you randomly assigned the participants to these different groups, a process that naturally controls for confounding variables."
However, with this process, different problems emerge. For instance, it would not be ethical or practical to randomly assign some participants to a “high-quality education” group and others to a “no-education” group.
“[Controlling] confounding variables via experimental manipulation is not always feasible,” Geher said.
Because of this, there are also statistical ways to try to control for confounding variables, such as “partial correlation,” which looks at a correlation between two variables (e.g., childhood SES and adulthood SES) while factoring out the effects of a potential confounding variable (e.g., educational attainment).
However, statistical control is only as good as the measurement of the confounder: a poorly measured confounder leaves residual confounding, and controlling for an inappropriate variable can itself introduce bias.
“This statistically oriented process is definitely not considered the gold standard compared with true experimental procedures, but often, it is the best you can do given ethical and/or practical constraints,” said Geher.
Controlling for confounding variables is critical in research primarily because it allows researchers to make sure that they are drawing valid and accurate conclusions.
“If you don’t correct for confounding variables, you put yourself at risk for drawing conclusions regarding relationships between variables that are simply wrong (at the worst) or incomplete (at the best),” said Geher.
Controlling for confounding variables includes a basic set of skills when it comes to the social and behavioral sciences, he added.
Human behavior is highly complex and any single action often has a broad array of variables that underlie it.
"Understanding the concept of confounding variables, as well as how to control for these variables, makes for better behavioral science with conclusions that are, simply, more valid than research that does not effectively take confounding variables into account," Geher said.
Wallach JD, Serghiou S, Chu L, et al. Evaluation of confounding in epidemiologic studies assessing alcohol consumption on the risk of ischemic heart disease. BMC Med Res Methodol. 2020;20(1):64. https://doi.org/10.1186/s12874-020-0914-6
Pourhoseingholi MA, Baghestani AR, Vahedi M. How to control confounding effects by statistical analysis. Gastroenterol Hepatol Bed Bench. 2012;5(2):79-83. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4017459/
By Cathy Cassata, a freelance writer who specializes in stories around health, mental health, medical news, and inspirational people.
Synonyms: confounding factor; confounding variable.
An (experimental) confound is a factor affecting both the dependent and the independent variables systematically, thus being responsible for (at least part of) their statistical relationship.
In quantitative psychological investigations, a researcher tries to discover statistical relationships between variables. This relationship is commonly quantified in terms of covariation in a statistical model. It is impossible to include all variables in the model, so any relationship revealed by the model may be caused or influenced by a variable that is not considered in the model. This variable responsible for the spurious relationship is called “confound.”
An empirical researcher conducting an investigation typically analyzes the relationship between dependent and independent variables using statistical models. These models have to be formulated including all variables...
Hilbert, S. (2020). Confound (Experimental). In: Zeigler-Hill, V., Shackelford, T.K. (eds) Encyclopedia of Personality and Individual Differences. Springer, Cham. https://doi.org/10.1007/978-3-319-24612-3_1286
A confounding variable is a variable that influences both the independent variable and dependent variable and leads to a false correlation between them. A confounding variable is also called a confounder, confounding factor, or lurking variable. Because confounding variables often exist in experiments, correlation does not mean causation. In other words, when you see a change in the independent variable and a change in the dependent variable, you can’t be certain the two variables are related.
Here are examples of confounding variables, a look at the difference between a confounder and a mediator, and ways to reduce the risk of confounding variables leading to incorrect conclusions.
Sometimes confounding points to a false cause-and-effect relationship, while other times it masks a true effect.
Correlation does not imply causation. If you’re unconvinced, check out the spurious correlations compiled by Tyler Vigen.
The first step to reduce the risk of confounding variables affecting your experiment is to try to identify anything that might affect the study. It’s a good idea to check the literature or at least ask other researchers about confounders. Otherwise, you’re likely to find out about them during peer review!
When you design an experiment, consider these techniques for reducing the effect of confounding variables:
A confounder affects both the independent and dependent variables. In contrast, a mediator or effect modifier does not affect the independent variable, but does modify the effect the independent variable has on the dependent variable. For example, in a test of drug effectiveness, the drug may be more effective in children than adults. In this case, age is an effect modifier. Age doesn’t affect the drug itself, so it is not a confounder.
In a way, a confounding variable results in bias in that it distorts the outcome of an experiment. However, bias usually refers to a type of systematic error from experimental design, data collection, or data analysis. An experiment can contain bias without being affected by a confounding variable.
Confounding variable: a factor that affects both the independent and dependent variables, leading to a false association between them.
Effect modifier: a variable that positively or negatively modifies the effect of the independent variable on the dependent variable.
Bias: a systematic error that masks the true effect of the independent variable on the dependent variable.
Are there any well-known statistical studies that were originally published and thought to be valid, but later had to be thrown out due to a confounding variable that wasn't taken into account? I'm looking for something easy to understand that could be explained to and appreciated by a quantitative literacy class that has zero pre-requisites.
Coffee drinking & lung cancer.
My favorite example is that supposedly, "coffee drinkers have a greater risk of lung cancer", despite most coffee drinkers... well... drinking coffee, rather than inhaling it.
There have been various studies about this, but the consensus remains that studies with this conclusion usually just have a larger proportion of smoking coffee drinkers than non-smoking coffee drinkers. In other words, the effect of smoking confounds the effect of coffee consumption if not included in the model. The most recent article on this I could find is a meta-analysis by Vania Galarraga and Paolo Boffetta (2016). $^\dagger$
Another example that plagues clinical research, is the claim that obesity can be beneficial for certain diseases. Specifically, many articles, still to this day (just do a quick search for obesity paradox on pubmed and be amazed), claim the following:
Why does this happen? Obesity is defined as excess fat negatively affecting health, yet we classify obesity based on BMI. BMI is just calculated as:
$$\text{BMI} = \frac{\text{weight in kg}}{(\text{height in m})^2},$$
so the most direct way to combat obesity is through weight loss (or by growing taller somehow).
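As a concrete illustration of the formula (a trivial helper function, not from any particular library):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

# e.g., 70 kg at 1.75 m
print(round(bmi(70, 1.75), 1))  # 22.9
```

Note that nothing in this calculation distinguishes fat mass from muscle mass, which is exactly why BMI is only a proxy for obesity.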
Regimens that focus on loss of weight rather than fat tend to result in a proportionally large loss of muscle. This is likely what causes lower BMI to be associated with a higher rate of major adverse events.
Because many studies do not include measures of body fat (percentage), but only BMI as a proxy, the amount of body fat confounds the effect of BMI on health.
A nice review of this phenomenon was written by Steven G. Chrysant (2018). $^\ddagger$ He ends with:
[B]ased on the recent evidence, the obesity paradox is a misnomer and could convey the wrong message to the general public that obesity is not bad.
Followed by:
Journals [should] no longer accept articles about the 'obesity paradox'.
$\dagger$ : Vania Galarraga and Paolo Boffetta (2016): Coffee Drinking and Risk of Lung Cancer—A Meta-Analysis. Cancer Epidemiol Biomarkers Prev June 1 2016 (25) (6) 951-957; DOI: 10.1158/1055-9965.EPI-15-0727
$\ddagger$ : Steven G. Chrysant (2018): Obesity is bad regardless of the obesity paradox for hypertension and heart disease. J Clin Hypertens (Greenwich). 2018 May;20(5):842-846. doi: 10.1111/jch.13281. Epub 2018 Apr 17.
Examples of (poor) studies claiming to have demonstrated the obesity paradox:
Articles refuting the obesity paradox as a mere confounding effect of body fat:
Articles about the obesity paradox in cancer:
You might want to introduce Simpson's Paradox .
The first example on that page is the UC Berkeley gender bias case, where it was thought that there was gender bias (towards males) in admissions when looking at overall acceptance rates, but this was eliminated or reversed when investigated by department. The confounding variable of department picked up on a gender difference in applying to more competitive departments.
After an initial study finding a link between living next to high-voltage transmission lines and cancer, follow-up studies found that when you include income in the model the effect of the power lines goes away.
Living next to power lines is a moderately accurate predictor of low household income / wealth. Put bluntly, there aren't as many fancy mansions next to transmission lines as elsewhere.
There is correlation between poverty and cancer. When comparisons were made between households on similar income brackets close to and far away from transmission lines, the effect of transmission lines disappeared.
In this case, the confounding variables were household wealth and distance to the nearest high voltage line.
Consider the following examples. I am not sure they are necessarily very famous but they help to demonstrate the potential negative effects of confounding variables.
Say one is studying the relation between birth order (1st child, 2nd child, etc.) and the presence of Down Syndrome in the child. In this scenario, maternal age would be a confounding variable:
Higher maternal age is directly associated with Down Syndrome in the child
Higher maternal age is directly associated with Down Syndrome, regardless of birth order (a mother having her 1st vs 3rd child at age 50 confers the same risk)
Maternal age is directly associated with birth order (the 2nd child, except in the case of twins, is born when the mother is older than she was for the birth of the 1st child)
Maternal age is not a consequence of birth order (having a 2nd child does not change the mother's age)
More examples
In risk assessments, factors such as age, gender, and educational levels often affect health status and so should be controlled. Beyond these factors, researchers may not consider or have access to data on other causal factors. An example is the study of smoking tobacco on human health. Smoking, drinking alcohol, and diet are lifestyle activities that are related. A risk assessment that looks at the effects of smoking but does not control for alcohol consumption or diet may overestimate the risk of smoking (Tjønneland, Grønbaek, Stripp, & Overvad, 1999). Smoking and confounding are reviewed in occupational risk assessments such as the safety of coal mining (Axelson, 1989). When there is not a large sample population of non-smokers or non-drinkers in a particular occupation, the risk assessment may be biased towards finding a negative effect on health.
References: https://en.wikipedia.org/wiki/Confounding
Tjønneland, A., Grønbaek, M., Stripp, C., & Overvad, K. (1999). Wine intake and diet in a random sample of 48763 Danish men and women. The American Journal of Clinical Nutrition, 69(1), 49-54.
Axelson, O. (1989). Confounding from smoking in occupational epidemiology. British Journal of Industrial Medicine, 46(8), 505-507.
There was one about diet that looked at diet in different countries and concluded that meat caused all sorts of problems (e.g. heart disease), but failed to account for the average lifespan in each country: The countries that ate very little meat also had lower life expectancies and the problems that meat "caused" were ones that were linked to age.
I don't have citations for this - I read about it about 25 years ago - but maybe someone will remember or maybe you can find it.
I'm not sure it entirely counts as a confounding variable so much as confounding situations , but animals' abilities to find their way through a maze may qualify.
As described in this ScienceDirect summary , studies of rats (or other animals) in mazes were popular for a large part of the 20th century, and continue today to some extent. One possible purpose is to study the subject's ability to remember a maze which it has previously run; another popular purpose is to study any bias in the subject's choices of whether to turn left or right at junctions, in a maze which the subject has not previously run.
It should be immediately clear that if the subject has forgotten the maze, then any inherent bias in choice of route will be a confounding factor. If the "right" direction coincides with the subject's bias, then they could find their way in spite of not remembering the route.
In addition to this, studies found various other confounding features exist which might not have been considered. The height of walls and width of passages are factors, for example. And if another subject has previously navigated the maze, subjects which rely strongly on their sense of smell (mice and dogs, for instance) may find their way simply by tracking the previous subject's scent. Even the construction of the maze may be an issue - animals tend to be less happy to run over "hollow-sounding" floors.
Many animal maze studies ended up finding confounding factors instead of the intended study results. More disturbingly, according to Richard Feynman, the studies reporting these confounding factors were not picked up by researchers at the time. As a result, we simply don't know whether any animal maze studies carried out around this time have any validity whatsoever. That's decades' worth of high-end research at the finest universities around the world, by the finest psychologists and animal behaviourists, and every last shred of it had, at best, to be taken with a very large pinch of salt. Later researchers had to go back and duplicate all this work to find out what was actually valid and what wasn't repeatable.
There was a great study of mobile phone use and brain cancer. Most people with a lateral brain cancer, when asked which hand they held their phone in, answered the diseased side. This seemed to show that phone use caused cancer.
However, maybe the answers are informed by hindsight. Someone thought of a great test for this: the sample was big enough to include some people with two cancers, so you could ask whether the declared side of phone use influenced the risk of a cancer on the other side of the brain. It was actually protective, thus showing the hindsight bias in the original result.
Sorry, I don't have the reference.
'Statistics' by Freedman, Pisani, and Purves has a number of examples in the first couple of chapters. My personal favorite is that ice cream causes polio. The confounding variable is that both are prevalent in the summertime, when young children are out, about, and spreading polio. The book is "Statistics," 4th Edition, by David Freedman, Robert Pisani, and Roger Purves.
This is not a study, but a gallery of spurious correlations that could be appreciated by a quantitative literacy class. The downside of this is the lack of an explanation (aside from chance).
See: Subversive Subjects: Rule-Breaking and Deception in Clinical Trials https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4520402/
Hormone replacement therapy and heart disease?
https://www.teachepi.org/wp-content/uploads/OldTE/documents/courses/bfiles/The%20B%20Files_File1_HRT_Final_Complete.pdf
The benefits were determined by observation, and essentially it appears that the people who chose to do HRT had higher socioeconomic status, healthier lifestyles, etc.
(So one could argue whether this is confounding vs. a limitation of observational studies.)
There are lots of good examples in Howard Wainer's books. In particular, Chapter 1, "The Most Dangerous Equation," in "Picturing the Uncertain World: How to Understand, Communicate, and Control Uncertainty through Graphical Display."
Examples include:
The small schools movement. People noticed that some small schools had better performance than large schools, so money was spent to reduce school sizes. It turned out that some small schools also had worse performance than large schools. It was largely an artefact of extreme outcomes showing up in small samples.
Kidney cancer rates (this example is also used in Daniel Kahneman's "Thinking, Fast and Slow"; see the start of Chapter 10). The lowest kidney cancer rates occur in rural, sparsely populated counties. These low rates must be because of the clean-living rural lifestyle. But wait: the counties with the highest incidence of kidney cancer are also rural and sparsely populated. That must be because of the lack of access to good medical care and too much drinking. Of course, the extremes are actually an artefact of the small populations.
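The small-sample effect behind both examples is easy to demonstrate with a quick simulation. This sketch uses entirely hypothetical county sizes and assumes the same underlying cancer rate everywhere; the extremes still cluster in the small counties:

```python
import numpy as np

rng = np.random.default_rng(0)
true_rate = 1e-4  # the same underlying rate in every county (hypothetical)

# 500 small counties (1,000 residents) and 500 large ones (1,000,000 residents)
small = rng.binomial(1_000, true_rate, size=500) / 1_000
large = rng.binomial(1_000_000, true_rate, size=500) / 1_000_000

# The most extreme observed rates -- the zero-incidence counties and the
# apparent hot spots -- all come from the small counties.
print(f"small counties: min {small.min():.5f}, max {small.max():.5f}")
print(f"large counties: min {large.min():.5f}, max {large.max():.5f}")
```

Nothing about rural life drives the extremes here; small denominators alone produce both the lowest and the highest observed rates.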
Earlier we wrote about different kinds of variables . In short, dependent variables are what you get (outcomes), independent variables are what you set, and extraneous variables are what you can’t forget (to account for).
When you measure a user experience using metrics—for example, the SUPR-Q, SUS, SEQ, or completion rate—and conclude that one website or product design is good, how do you know it’s really the design that is good and not something else? While it could be due to the design, it could also be that extraneous (or nuisance) variables, such as prior experiences, brand attitudes, and recruiting practices, are confounding your findings.
A critical skill when reviewing UX research findings and published research is the ability to identify when the experimental design is confounded.
Confounding can happen when there are variables in play that the design does not control and can also happen when there is insufficient control of an independent variable.
There are numerous strategies for dealing with confounding that are outside the scope of this article. In fact, it’s a topic that covers several years of graduate work in disciplines such as experimental psychology.
Our goal in this first of a series of articles is to show how to identify a specific type of confounded design in published experiments and demonstrate how their data can be reinterpreted once you’ve identified the confounding.
One of the great scientific innovations in the early 20th century was the development of the analysis of variance (ANOVA) and its use in analyzing factorial designs. A full factorial design is one that includes multiple independent variables (factors), with experimental conditions set up to obtain measurements under each combination of levels of factors. This approach allows experimenters to estimate the significance of each factor individually (main effects) and see how the different levels of the factors might behave differently in combination (interactions). This is all great when the factorial design is complete, but when it’s incomplete, it becomes impossible to untangle potential interactions among the factors.
For example, imagine an experiment in which participants sort cards and there are two independent variables—the size of the cards (small and large) and the size of the print on the cards (small and large). This is the simplest full factorial experiment, having two independent variables (card size and print size), each with two levels (small and large). For this 2×2 factorial experiment, there are four experimental conditions:
The graph below shows hypothetical results for this imaginary experiment. There is an interaction such that the combination of large cards and large print led to a faster sort time (45 s), but all the other conditions have the same sort time (60 s).
But what if for some reason the experimenter had not collected data for the small card/small print condition? Then the marginal mean for large print (averaging over card size) and the marginal mean for large cards (averaging over print size) would be computed from the same two cells, each equal to (60 + 45)/2 = 52.5. An experimenter focused on the effect of print size might claim that the data show a benefit to larger print, but the counterargument would be that the effect is due to card size instead. With this incomplete design, you couldn’t say with certainty whether the benefit in the large card/large print condition was due to card size, print size, or that specific combination.
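A tiny sketch (using the hypothetical sort times above) makes the ambiguity concrete: once the small-card/small-print cell is dropped, the large-card and large-print marginals are averages of the same two cells and can no longer be separated.

```python
# Hypothetical sort times (seconds), keyed by (card_size, print_size)
times = {
    ("small", "small"): 60, ("small", "large"): 60,
    ("large", "small"): 60, ("large", "large"): 45,
}

def marginal_mean(data, factor, level):
    """Average over every condition where `factor` (0=card, 1=print) is at `level`."""
    vals = [t for cond, t in data.items() if cond[factor] == level]
    return sum(vals) / len(vals)

# Incomplete design: the small-card/small-print condition was never run
incomplete = {k: v for k, v in times.items() if k != ("small", "small")}

print(marginal_mean(incomplete, 0, "large"))  # large cards: (45 + 60) / 2 = 52.5
print(marginal_mean(incomplete, 1, "large"))  # large print: (45 + 60) / 2 = 52.5
# Identical marginals: card size and print size are confounded in this design.
```

With the full design, the small-print marginal (60 s) would differ from the large-print marginal, and the interaction could be tested; with the incomplete design, the two factors produce arithmetically identical evidence.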
Moving from hypothetical to published experiments, we first show confounding in a famous psychological study, then in a somewhat less famous but influential human factors study, and finally in UX measurement research.
In the late 1950s and early 1960s, psychologist Harry Harlow conducted a series of studies with infant rhesus monkeys, most of which would be considered unethical by modern standards. In his most famous study, infant monkeys were removed from their mothers and given access to two surrogate mothers, one made of terry cloth (providing tactile comfort but no food) and one made of wire with a milk bottle (providing food but no tactile comfort). The key finding was that the infant monkeys preferred to spend more time close to the terry cloth mother, using the wire mother only to feed. The image below shows both mothers.
Image from Wikipedia.
In addition to the manipulation of comfort and food, there was also a clear manipulation of the surrogate mothers’ faces. The terry cloth mother’s face was rounded and had ears, nose, big eyes, and a smile. The wire mother’s face was square and devoid of potentially friendly features. With this lack of control, it’s possible that the infants’ preference for the terry cloth mother might have been due to just tactile comfort, just the friendly face, or a combination of the two. In addition to ethical issues associated with traumatizing infant monkeys, the experiment was deeply confounded.
Typing keyboards have been around for over 100 years, and there has been a lot of research on their design — different types of keys, different key layouts, and from the 1960s through the 1990s, different keyboard configurations. Specifically, researchers conducted studies of different types of split keyboards intended to make typing more comfortable and efficient by allowing a more natural wrist posture. The first split keyboard design was the Klockenberg keyboard, described in Klockenberg's 1926 book.
One of the most influential papers promoting split keyboards was “Studies on Ergonomically Designed Alphanumeric Keyboards” by Nakaseko et al., published in 1985 in the journal Human Factors. In that study, they described an experiment in which participants used three different keyboards—a split keyboard with a large wrist rest (see the figure below), a split keyboard with a small wrist rest, and a standard keyboard with a large wrist rest. They did not provide a rationale for failing to include a standard keyboard with a small wrist rest, and this omission made their experiment an incomplete factorial.
Image from Lewis et al. (1997) “ Keys and Keyboards .”
They had participants rank the keyboards by preference, with the following results:
| Rank | Split with Large Rest | Split with Small Rest | Standard with Large Rest |
|---|---|---|---|
| 1 | 16 | 7 | 9 |
| 2 | 6 | 13 | 11 |
| 3 | 9 | 11 | 11 |
The researchers’ primary conclusion was “After the typing tasks, about two-thirds of the subjects asserted that they preferred the split keyboard models.” This is true because 23/32 participants’ first choice was a split keyboard condition. What they failed to note was that 25/32 participants’ first choice was a keyboard condition that included a large wrist rest. If they had collected data for a standard keyboard with a small wrist rest, it would have been possible to untangle the potential interaction—but they didn’t.
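Both readings of the ranking data follow directly from the first-place counts in the table; a short sketch tallying them two ways:

```python
# First-place counts from the ranking table above (32 participants total)
first_place = {
    ("split", "large_rest"): 16,
    ("split", "small_rest"): 7,
    ("standard", "large_rest"): 9,
}
n = sum(first_place.values())

# Tally the same votes along each factor of the (incomplete) design
split_first = sum(v for (kb, _), v in first_place.items() if kb == "split")
large_rest_first = sum(v for (_, rest), v in first_place.items() if rest == "large_rest")

print(f"first choice was a split keyboard:   {split_first}/{n}")   # 23/32
print(f"first choice had a large wrist rest: {large_rest_first}/{n}")  # 25/32
# Same data, two equally defensible headlines -- the design can't separate them.
```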
In recent articles, we explored the effect of verbal labeling of rating scale response options; specifically, whether partial or full labeling affects the magnitude of responses, first in a literature review, and then in a designed experiment.
One of the papers in our literature review was Krosnick and Berent (1993) [pdf]. They reported the results of a series of political science studies investigating the effects of full versus partial labeling of response options and branching. In the Branching condition, questions were split into two parts, with the first part capturing the direction of the response (e.g., “Are you a Republican, Democrat, or independent?”) and the second capturing the intensity (e.g., “How strong or weak is your party affiliation?”). In the Nonbranching condition, both direction and intensity were captured in one question. The key takeaway from their abstract was, “We report eight experiments … demonstrating that fully labeled branching measures of party identification and policy attitudes are more reliable than partially labeled nonbranching measures of those attitudes. This difference seems to be attributable to the effects of both verbal labeling and branching.”
If all you read was the abstract, you’d think that full labeling was a better measurement practice than partial labeling. But when you review research, you can’t just read and accept the claims in the abstract. The figure below shows part of Table 1 from Krosnick and Berent (1993). Note that they list only three question formats. If their experimental designs had been full factorials, there would have been four. Missing from the design is the combination of partial labeling and branching. The first four studies also omitted the combination of full labeling with nonbranching, so any “significant” findings in those studies could be due to labeling or branching differences.
Image from Krosnick and Berent (1993) [pdf].
The fifth study at least included the Fully Labeled Nonbranching condition and produced the following results (numbers in cells are the percentage of respondents who gave the same answer on two different administrations of the same survey questions):
| | Full | Partial | Diff |
|---|---|---|---|
| Branching | 68.4% | NA | NA |
| Nonbranching | 57.8% | 58.9% | 1.1% |
| Diff | 10.6% | NA | |
To analyze these results, Krosnick and Berent conducted two tests, one on the differences between Branching and Nonbranching holding Full Labeling constant and the second on the differences between Full and Partial Labeling holding Nonbranching constant. They concluded there was a significant effect of branching but no significant effect of labeling, bringing into question the claim they made in their abstract.
If you really want to understand the effects of labeling and branching on response consistency, the missing cell in the table above is a problem. Consider two possible hypothetical sets of results, one in which the missing cell matches the cell to its left and one in which it matches the cell below.
| | Full | Partial | Diff |
|---|---|---|---|
| Branching | 68.4% | 68.4% | 0.0% |
| Nonbranching | 57.8% | 58.9% | 1.1% |
| Diff | 10.6% | 9.5% | |
| | Full | Partial | Diff |
|---|---|---|---|
| Branching | 68.4% | 58.9% | -9.5% |
| Nonbranching | 57.8% | 58.9% | 1.1% |
| Diff | 10.6% | 0.0% | |
In the first hypothetical, the conclusion would be that branching is more reliable than nonbranching and labeling doesn’t matter. For the second hypothetical, the conclusion would be that there is an interaction suggesting that full labeling is better than partial, but only for branching questions and not for nonbranching. But without data for the missing cell, you just don’t know!
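The dependence on the missing cell can be made explicit with a small sketch that fills it with each hypothetical value and recomputes the two main effects (percentages taken from the tables above; the fill-in values are, of course, hypothetical):

```python
# Observed test-retest consistency (%), with the branching/partial cell missing
observed = {
    ("branching", "full"): 68.4,
    ("nonbranching", "full"): 57.8,
    ("nonbranching", "partial"): 58.9,
}

def main_effects(cells):
    """Branching effect (branching - nonbranching) and labeling effect (full - partial)."""
    row = lambda r: (cells[(r, "full")] + cells[(r, "partial")]) / 2
    col = lambda c: (cells[("branching", c)] + cells[("nonbranching", c)]) / 2
    return row("branching") - row("nonbranching"), col("full") - col("partial")

# Hypothetical A: the missing cell matches the cell to its left (68.4)
# Hypothetical B: the missing cell matches the cell below it (58.9)
for label, fill in [("A", 68.4), ("B", 58.9)]:
    b, l = main_effects({**observed, ("branching", "partial"): fill})
    print(f"{label}: branching effect {b:+.2f}, labeling effect {l:+.2f}")
```

Hypothetical A yields a branching effect of about +10.1 points with a labeling effect near zero; Hypothetical B yields roughly +5.3 and +4.2. Two quite different stories from the same three observed cells, which is exactly why the missing cell matters.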
When reading published research, it’s important to read critically. One aspect of critical reading is to identify whether the design of the reported experiment is confounded in a way that casts doubt on the researchers’ claims.
This is not a trivial issue, and as we’ve shown, influential research has been published that has affected social policy (Harlow’s infant monkeys), product claims (split keyboards), and survey design practices (labeling and branching). But upon close and critical inspection, the experimental designs were flawed by virtue of confounding; specifically, the researchers were drawing conclusions from incomplete factorial experimental designs.
In future articles, we’ll revisit this topic from time to time with analyses of other published experiments we’ve reviewed that, unfortunately, were confounded.
EEG-based study of design creativity: a review on research design, experiments, and analysis.
Brain dynamics associated with design creativity tasks are largely unexplored. Despite significant strides, there is a limited understanding of brain behavior during design creation tasks. The objective of this paper is to review the concepts of creativity and design creativity as well as their differences, and to explore the brain dynamics associated with design creativity tasks using electroencephalography (EEG) as a neuroimaging tool. The paper aims to provide essential insights for future researchers in the field of design creativity neurocognition. It seeks to examine fundamental studies, present key findings, and initiate a discussion on associated brain dynamics. The review employs thematic analysis and a forward and backward snowball search methodology with specific inclusion and exclusion criteria to select relevant studies. This search strategy ensured a comprehensive review focused on EEG-based creativity and design creativity experiments. Different components of those experiments, such as participants, psychometrics, experiment design, and creativity tasks, are reviewed and then discussed. The review identifies that while some studies have converged on specific findings regarding EEG alpha band activity in creativity experiments, inconsistencies remain in the literature. The paper underscores the need for further research to unravel the interplay between these cognitive processes. This comprehensive review serves as a valuable resource for readers seeking an understanding of the current literature, principal discoveries, and areas where knowledge remains incomplete. It highlights both positive and foundational aspects, identifies gaps, and poses lingering questions to guide future research endeavors.
1.1 Creativity, design, and design creativity
Investigating design creativity presents significant challenges due to its multifaceted nature, involving nonlinear cognitive processes and various subtasks such as divergent and convergent thinking, perception, memory retrieval, learning, inferring, understanding, and designing ( Gero, 1994 ; Gero, 2011 ; Nguyen and Zeng, 2012 ; Jung and Vartanian, 2018 ; Xie, 2023 ). Additionally, design creativity tasks are often ambiguous, intricate, and nonlinear, further complicating efforts to understand the underlying mechanisms and the brain dynamics associated with creative design processes.
Creativity, one of the higher-order cognitive processes, is defined as the ability to develop useful, novel, and surprising ideas ( Sternberg and Lubart, 1998 ; Boden, 2004 ; Runco and Jaeger, 2012 ; Simonton, 2012 ). Needless to say, creativity occurs in all parts of social and personal life and all situations and places, including everyday cleverness, the arts, sciences, business, social interaction, and education ( Mokyr, 1990 ; Cropley, 2015b ). However, this study particularly focuses on reviewing EEG-based studies of creativity and design creativity tasks.
Design, as a fundamental and widespread human activity, aiming at changing existing situations into desired ones ( Simon, 1996 ), is nonlinear and complex ( Zeng, 2001 ), and lies at the heart of creativity ( Guilford, 1959 ; Gero, 1996 ; Jung and Vartanian, 2018 ; Xie, 2023 ). According to the recursive logic of design ( Zeng and Cheng, 1991 ), a designer intensively interacts with the design problem, design environment (including stakeholders of design, design context, and design knowledge), and design solutions in the recursive environment-based design evolution process ( Zeng and Gu, 1999 ; Zeng, 2004 , 2015 ; Nagai and Gero, 2012 ). Zeng (2002) conceptualized the design process as an environment-changing process in which the product emerges from the environment, serves the environment, and changes the environment ( Zeng, 2015 ). Convergent and divergent thinking are two primary modes of thinking in the design process, which are involved in analytical, critical, and synthetic processes. Divergent thinking leads to possible solutions, some of which might be creative, to the design problem whereas convergent thinking will evaluate and filter the divergent solutions to choose appropriate and practical ones ( Pahl et al., 1988 ).
Creative design is inherently unpredictable; at times, it may seem implausible – yet it happens. Some argue that a good design process and methodology form the foundation of creative design, while others emphasize the significance of both design methodology and knowledge in fostering creativity. It is noteworthy that different designers may propose varied solutions to the same design problem, and even the same designer might generate diverse design solutions for the same problem over time ( Zeng, 2001 ; Boden, 2004 ). Creativity may spontaneously emerge even if one does not intend to conduct a creative design, whereas creative design just may not come out no matter how hard one tries. A design is considered routine if it operates within a design space of known and ordinary designs, innovative if it navigates within a defined state space of potential designs but yields different outcomes, and creative if it introduces new variables and structures into the space of potential designs ( Gero, 1990 ). Moreover, it is conceivable that a designer may lack creativity while the product itself demonstrates creative attributes, and conversely, a designer may exhibit creativity while the resulting product does not ( Yang et al., 2022 ).
Several models of design creativity have been proposed in the literature. In some earlier studies, design creativity was addressed as engineering creativity or creative problem-solving ( Cropley, 2015b ). As used in recent studies ( Jia et al., 2021 ; Jia and Zeng, 2021 ), the stages of design creativity include problem understanding, idea generation, idea evolution, and idea validation ( Guilford, 1959 ). Problem understanding and idea validation are assumed to be convergent cognitive tasks whereas idea generation and idea evolution are considered divergent tasks in design creativity. An earlier model of creative thinking proposed by Wallas (1926) comprises four phases: preparation, incubation, illumination, and verification ( Cropley, 2015b ). The “Preparation” phase involves understanding a topic and defining the problem. During “Incubation,” one processes the information, usually subconsciously. In the “Illumination” phase, a solution appears, often unexpectedly. Lastly, “Verification” involves evaluating and implementing the derived solution. In addition to this model, a seven-phase model (an extended version of the four-phase model) was later introduced, containing preparation, activation, generation, illumination, verification, communication, and validation ( Cropley, 2015a , b ). It is crucial to emphasize that these phases are not strictly sequential or distinct, in that interactions, setbacks, restarts, or premature conclusions might occur ( Haner, 2005 ). In contrast to those empirical models of creativity, the nonlinear recursive logic of design creativity was rigorously formalized in a mathematical design creativity theory ( Zeng, 2001 ; Zeng et al., 2004 ; Zeng and Yao, 2009 ; Nguyen and Zeng, 2012 ).
For further details on the theories and models of creativity and design creativity, readers are directed to the referenced literature ( Gero, 1994 , 2011 ; Kaufman and Sternberg, 2010 ; Williams et al., 2011 ; Nagai and Gero, 2012 ; Cropley (2015b) ; Jung and Vartanian, 2018 ; Yang et al., 2022 ; Xie, 2023 ).
First, we would like to provide the definitions of “design” and “creativity” which can be integrated into the definition of “design creativity.” According to the Cambridge Dictionary, the definition of design is: “to make or draw plans for something.” In addition, the definition of creativity is: “the ability to make something new or imaginative.” So, the definition of design creativity is: “the ability to design something new and valuable.” With these definitions, we focus on design creativity neurocognition in this section.
It is of great importance to study design creativity neurocognition as the brain plays a pivotal role in the cognitive processes underlying design creativity tasks. So, to better investigate design creativity we need to concentrate on brain mechanisms associated with the related cognitive processes. However, the complexity of these tasks has led to a significant gap in our understanding; consequently, our knowledge about the neural activities associated with design creativity remains largely limited and unexplored. To address this gap, a burgeoning field known as design creativity neurocognition has emerged. This field focuses on investigating the intricate and unstructured brain dynamics involved in design creativity using various neuroimaging tools such as electroencephalography (EEG).
In a nonlinear evolutionary model of design creativity, it is suggested that the brain handles problems and ideas in a way that leads to unpredictable and potentially creative solutions ( Zeng, 2001 ; Nguyen and Zeng, 2012 ). This involves cognitive processes like thinking of ideas, evolving and evaluating them, along with physical actions like drawing ( Zeng et al., 2004 ; Jia, 2021 ). This indicates that the brain, as a complex and nonlinear system with characteristics like emergence and self-organization, goes through several cognitive processes which enable the generation of creative ideas and solutions. Exploring brain activities during design creativity tasks gives us better insight into the design process and improves how designers perform. As a result, design neurocognition combines traditional design study methods with approaches from cognitive neuroscience, neurophysiology, and artificial intelligence, offering unique perspectives on understanding design thinking ( Balters et al., 2023 ). Although several studies have focused on design and creativity, the brain dynamics associated with design creativity are largely untouched. This motivated us to conduct this literature review to explore the studies, gather the information and findings, and finally discuss them. Given the advantages of electroencephalography (EEG) in design creativity experiments, which will be explained in the following paragraphs, we decided to focus on EEG-based neurocognition in design creativity.
As mentioned before, design creativity tasks are cognitive activities which are complex, dynamic, nonlinear, self-organized, and emergent. The brain dynamics of design creativity are largely unknown. Brain behavior recognition during design-oriented tasks helps scientists investigate neural mechanisms, vividly understand design tasks, enhance whole design processes, and better help designers ( Nguyen and Zeng, 2014a , b , 2017 ; Liu et al., 2016 ; Nguyen et al., 2018 , 2019 ; Zhao et al., 2018 , 2020 ; Jia, 2021 ; Jia et al., 2021 ; Jia and Zeng, 2021 ). Exploring brain neural circuits in design-related processes has recently gained considerable attention in different fields of science. Several studies have been conducted to decode brain activity in different steps of design creativity ( Petsche et al., 1997 ; Nguyen and Zeng, 2010 , 2014a , b , 2017 ; Liu et al., 2016 ; Nguyen et al., 2018 ; Vieira et al., 2019 ). Such attempts will lead to investigating the mechanism and nature of the design creativity process and consequently enhance designers’ performance ( Balters et al., 2023 ). The main question of the studies performed in design creativity neurocognition is whether and how we can explore brain dynamics and infer designers’ cognitive states using neuro-cognitive and physiological data like EEG signals.
Neuroimaging is a vital tool in understanding the brain’s structure and function, offering insights into various neurological and psychological conditions. It employs a range of techniques to visualize the brain’s activity and structure. Neuroimaging methods mainly include magnetic resonance imaging (MRI), computed tomography (CT), electroencephalography (EEG), functional near-infrared spectroscopy (fNIRS), functional MRI (fMRI), and magnetoencephalography (MEG). Neuroimaging techniques have helped researchers explore brain dynamics in complex cognitive tasks, one of which is design creativity ( Nguyen and Zeng, 2014b ; Gao et al., 2017 ; Zhao et al., 2020 ). While several neuroimaging methods exist to study brain activity, electroencephalography (EEG) is among the most widely used across studies and applications. As an inexpensive and simple neuroimaging technique with high temporal resolution and acceptable spatial resolution, EEG has been used to infer designers’ cognitive and emotional states. Zangeneh Soroush et al. (2023a , b) have recently introduced two comprehensive datasets encompassing EEG recordings in design and creativity experiments, stemming from several EEG-based design and design creativity studies ( Nguyen and Zeng, 2014a ; Nguyen et al., 2018 , 2019 ; Jia, 2021 ; Jia et al., 2021 ; Jia and Zeng, 2021 ). In this paper, we review some of the most fundamental studies which have employed electroencephalography (EEG) to explore brain behavior in creativity and design creativity tasks.
EEG stands out as a highly promising method for investigating brain dynamics across various fields, including cognitive, clinical, and computational neuroscience studies. In the context of design creativity, EEG offers a valuable means to explore brain activity, particularly considering the physical movements inherent in the design process. However, EEG analysis poses challenges due to its complexity, nonlinearity, and susceptibility to various artifacts. Therefore, gaining a comprehensive understanding of EEG and mastering its utilization and processing is crucial for conducting effective experiments in design creativity research. This review aims to examine studies that have utilized EEG in investigating design creativity tasks.
EEG is a technique for recording the electrical activity of the brain, primarily generated by neuronal firing within the human brain. This activity is almost always captured non-invasively from the scalp in most cognitive studies, though intracranial EEG (iEEG) is recorded inside the skull, for instance in surgical planning for epilepsy. EEG signals are the result of voltage differences measured across two points on the scalp, reflecting the summed synchronized synaptic activities of large populations of cortical neurons, predominantly from pyramidal cells ( Teplan, 2002 ; Sanei and Chambers, 2013 ).
While the spatial resolution of EEG is relatively poor, EEG offers excellent temporal resolution, capturing neuronal dynamics within milliseconds, a feature not matched by other neuroimaging modalities like functional Near-Infrared Spectroscopy (fNIRS), Positron Emission Tomography (PET), or functional Magnetic Resonance Imaging (fMRI).
In contrast, fMRI provides much higher spatial resolution, offering detailed images of brain activity by measuring blood flow changes associated with neuronal activity. However, fMRI’s temporal resolution is lower than EEG, as hemodynamic responses are slower than electrical activities. PET, like fMRI, offers high spatial resolution and involves tracking a radioactive tracer injected into the bloodstream to image metabolic processes in the brain. It is particularly useful for observing brain metabolism and neurochemical changes but is invasive and has limited temporal resolution. fNIRS, measuring hemodynamic responses in the brain via near-infrared light, stands between EEG and fMRI in terms of spatial resolution. It is non-invasive and offers better temporal resolution than fMRI but is less sensitive to deep brain structures compared to fMRI and PET. Each of these techniques, with their unique strengths and limitations, provides complementary insights into brain function ( Baillet et al., 2001 ; Sanei and Chambers, 2013 ; Choi and Kim, 2018 ; Peng, 2019 ).
This understanding of EEG, from its historical development by Hans Berger in 1924 to modern digital recording and analysis techniques, underscores its significance in studying brain function and diagnosing neurological conditions. Despite advancements in technology, the fundamental methods of EEG recording have remained largely unchanged, emphasizing its enduring relevance in neuroscience ( Teplan, 2002 ; Choi and Kim, 2018 ).
Balters et al. (2023) conducted a comprehensive systematic review including 82 papers on design neurocognition, covering nine topics and a large variety of methodological approaches. A systematic review ( Pidgeon et al., 2016 ) reported several EEG-based studies on functional neuroimaging of visual creativity. Although such a comprehensive review exists in the field of design neurocognition, only a few early reviews focused on creativity neurocognition ( Fink and Benedek, 2014 , 2021 ; Benedek and Fink, 2019 ).
The present review not only reports the studies but also critically discusses the previous findings and results. The rest of this paper is organized as follows: Section 2 introduces our review methodology; Section 3 presents the results from our review process, and Section 4 discusses the major implications of the existing design creativity neurocognition research in future studies. Section 5 concludes the paper.
Figure 1 shows the main components of EEG-based design creativity studies: (1) experiment design, (2) participants, (3) psychometric tests, (4) experiments (creativity tasks), (5) EEG recording and analysis methods, and (6) final data analysis. The experiment design consists of the experiment protocol, which includes the (design) creativity tasks, the criteria for choosing participants, the conditions of the experiment, and the recorded physiological responses (EEG here). Setting and adjusting these components plays a crucial role in successful experiments and reliable results. In this paper, we review studies based on the components in Figure 1 .
Figure 1 . The main components of EEG-based design creativity studies.
The components described in Figure 1 are consistent with the stress-effort model proposed by Nguyen and Zeng ( Nguyen and Zeng, 2012 ; Zhao et al., 2018 ; Yang et al., 2021 ), which characterizes the relationship between mental stress and mental effort by a bell-shaped curve. This model defines mental stress as the ratio of the perceived task workload to the mental capability constituted by affect, skills, and knowledge. Knowledge is shaped by individual experience and understanding related to the given task workload. Skills encompass thinking styles, strategies, and reasoning ability. The degree of affect in response to a task workload can influence the effective utilization of the skills and knowledge. We thus used this model to form our research questions, determine the keywords, and conduct our analysis and discussions.
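In compact form, the model's central quantity can be written as follows (our notational shorthand, not the authors' original symbols):

```latex
\text{mental stress} \;=\; \frac{W_{\text{perceived}}}{C(\text{affect},\,\text{skills},\,\text{knowledge})}
```

where \(W_{\text{perceived}}\) denotes the perceived task workload and \(C(\cdot)\) the mental capability jointly constituted by affect, skills, and knowledge; the bell-shaped stress-effort curve then relates this ratio to mental effort.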
We focused on studies assessing brain function in design creativity experiments through EEG analysis. For a comprehensive review, we followed a thorough search strategy based on thematic analysis ( Braun and Clarke, 2012 ), which helped us code and extract themes from the initial (seed) papers. We began without a fixed topic, immersing ourselves in the existing literature to shape our research questions, which in turn informed our search keywords and, later, our search queries.
Our main research questions (RQs) were:
RQ1: What are the effective experiment design and protocol to ensure high-quality EEG-based design creativity studies?
RQ2: How can we efficiently record, preprocess, and process EEG reflecting the cognitive workload associated with design creativity tasks?
RQ3: What are the existing methods to analyze the data extracted from EEG signals recorded during design creativity tasks?
RQ4: How can EEG signals provide significant insight into neural circuits and brain dynamics associated with design creativity tasks?
RQ5: What are the significant neuroscientific findings, shortcomings, and inconsistencies in the literature?
With the initial information extracted from the seed papers and the previous studies by the authors in this field ( Nguyen and Zeng, 2012 , 2014a , b ; Jia et al., 2021 ; Jia and Zeng, 2021 ; Yang et al., 2022 ; Zangeneh Soroush et al., 2024 ), we built a conceptual model represented by Figure 1 and then formed these research questions. With this understanding and the RQs, we set our search strategy.
Our search started with broad terms like “design,” “creativity,” and “EEG,” which encapsulate the overarching cognitive activities and the physiological measurement. As we identified relevant papers, we refined our keywords for a more targeted search, using Boolean operators such as “OR” and “AND” to fine-tune the queries, which the authors further enhanced with the knowledge obtained from the selected papers. The first phase started with thematic analysis and continued recursively with choosing papers, obtaining knowledge, discussing the keywords, and updating the queries, until an appropriate query produced the desired search results. We applied thematic analysis only in the first iteration, to ensure a correct and comprehensive understanding of EEG-based design creativity together with an appropriate set of keywords and queries. Finally, we arrived at the following comprehensive search query:
(“EEG” OR “Electroenceph*” OR “brain” OR “neur*” OR “neural correlates” OR “cognit*”) AND (“design creativity” OR “ideation” OR “creative” OR “divergent thinking” OR “convergent thinking” OR “design neurocognition” OR “creativity” OR “creative design” OR “design thinking” OR “design cognition” OR “creation”)
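A query of this shape can also be assembled programmatically, which helps keep variants consistent across databases. The sketch below uses abbreviated keyword lists, not the full query above:

```python
# Build a Boolean search query from two keyword groups:
# (term1 OR term2 OR ...) AND (termA OR termB OR ...)
neuro_terms = ["EEG", "Electroenceph*", "brain", "neur*", "cognit*"]
creativity_terms = ["design creativity", "ideation", "divergent thinking",
                    "design neurocognition", "creativity"]

def or_group(terms):
    """Join quoted terms with OR and wrap the group in parentheses."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = f"{or_group(neuro_terms)} AND {or_group(creativity_terms)}"
print(query)
```

Databases differ in wildcard and phrase syntax, so in practice the generated string still needs per-database adjustment.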
The search query combines terminology related to design and creativity with terminology about EEG, neural activity, and the brain. In a quick, general evaluation, we found that the query returned relevant studies in the field, confirming that our keywords worked effectively. We then searched well-known databases, namely PubMed, Scopus, and Web of Science, to collect a comprehensive set of original papers, theses, and reviews; using several electronic databases reduces the risk of bias, improves the accuracy of findings, and broadens coverage of the literature. We continued searching each database until no more significant papers emerged from it. We did not restrict the search to any specific time interval, and we searched the “title,” “abstract,” and “keywords” fields. We then selected papers based on the following inclusion criteria:
1. The paper should answer one or more research questions (RQ1-RQ5).
2. The paper must be a peer-reviewed journal paper written in English.
3. The paper should focus on EEG analysis related to creativity or design creativity for adult participants.
4. The paper should be related to creativity or design creativity in terms of the concepts, experiments, protocols, and probable models employed in the studies.
5. The paper should use established creativity tasks, including the Alternate Uses Task (AUT), the Torrance Tests of Creative Thinking (TTCT), or a specific design task. (These tasks are detailed further on.)
6. The paper should include a quantitative analysis of EEG signals in the creativity or design creativity domain.
7. In addition to the above criteria, the authors screened the papers to confirm that the included publications met the standards of high-quality work.
These criteria were used to select our initial papers from the large set of papers we collected from Scopus, Web of Science, and PubMed. It should be mentioned that conflicts were resolved through discussion and duplicate papers were removed.
After our initial selection, we used Google Scholar to perform forward and backward snowball searches. We chose snowballing over a full systematic review because it offers an efficient alternative that is particularly valuable in emerging fields or when the scope of inquiry is evolving, allowing researchers to quickly uncover pertinent insights and connect seminal and contemporary works. During each iteration of the snowball search, we applied the aforementioned criteria to include or exclude papers. We continued the procedure until it converged to a final set of papers: after six iterations, no new significant papers emerged.
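The iterative snowball procedure can be viewed as a fixed-point computation over a citation graph. A toy sketch follows (the miniature graph and inclusion rule are hypothetical; in practice the neighbors come from a database's cited-by and reference lists):

```python
# Toy snowball search: starting from seed papers, repeatedly add papers
# that the current set cites (backward) or that cite it (forward) and
# that pass the inclusion criteria, until no new papers appear.
def snowball(seeds, cites, cited_by, include):
    selected = {p for p in seeds if include(p)}
    while True:
        frontier = set()
        for p in selected:
            frontier |= set(cites.get(p, []))     # backward: references of p
            frontier |= set(cited_by.get(p, []))  # forward: papers citing p
        new = {p for p in frontier if include(p)} - selected
        if not new:          # converged: no new relevant papers
            return selected
        selected |= new

# Hypothetical miniature citation graph: A cites B, B cites C, D cites A
cites = {"A": ["B"], "B": ["C"], "D": ["A"]}
cited_by = {"A": ["D"], "B": ["A"], "C": ["B"]}
result = snowball({"A"}, cites, cited_by, include=lambda p: p != "D")
print(sorted(result))  # D is reachable but excluded by the criteria
```

The loop terminates because the selected set grows monotonically within a finite paper universe, mirroring the convergence observed after six iterations in our search.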
Our search was completed by October 1, 2023. We then organized and studied the final set of included publications.
Figure 2 illustrates the general flow of our search procedure, adapted from the PRISMA guidelines ( Liberati et al., 2009 ). With the search keywords, we identified 1878 studies during the thematic analysis phase and considered them when selecting the seed papers for the subsequent snowball search. After performing the snowball search and applying the inclusion and exclusion criteria, we selected 154 studies: 82 related to creativity (60 original papers, 12 theses, and 10 review papers) and 72 related to design creativity (63 original papers, 5 theses, and 4 review papers). We also found 6 related textbooks and 157 studies using other modalities (such as fMRI and fNIRS), which were excluded; we used the textbooks, theses, and their references to build knowledge in the initial steps of the review, and some fMRI and fNIRS studies to evaluate the results in the discussion. In the snowball process, a number of studies appeared consistently across all iterations, implying their high relevance and influence; the recurrent selection and citation of these pivotal papers underscores their role in shaping the research landscape of design creativity and EEG studies.
Figure 2 . Search procedure and results (adapted from PRISMA), using thematic analysis in the first iteration and snowball search in the following iterations.
As discussed in Section 1, creativity and design creativity studies are different yet closely related, in that design creativity involves a more complex design process. In this subsection, we look at how design creativity neurocognition research followed creativity neurocognition research (though not necessarily causally).
Three early studies in the field of creativity neurocognition are Martindale and Mines (1975) , Martindale and Hasenfus (1978) , and Martindale et al. (1984) . The first study ( Martindale and Mines, 1975 ) stated that creative individuals may exhibit certain traits linked to lower cortical activation. It showed distinct neural activities when participants engaged in two creativity tasks: the Alternate Uses Task (AUT) and the Remote Associates Test (RAT). The AUT, which gauges ideational fluency and involves unfocused attention, is related to higher alpha power, whereas the RAT, which centers on producing specific answers, shows varied alpha levels. Previous psychological research aligns with these findings, emphasizing the different nature of the two tasks. Creativity, as determined by both tests, is associated with high alpha percentages during the AUT, hinting at an association between creativity and reduced cortical activation during creative tasks. However, highly creative individuals also show a mild deficit in cortical self-control, evident in their increased alpha levels even when trying to suppress them. This behavior mirrors findings from earlier and later studies and implies that these individuals might have a predisposition to disinhibition. The varying alpha levels during cognitive tasks likely stem from their reaction to the tasks rather than intentional focus shifts ( Martindale and Mines, 1975 ).
In the second study ( Martindale and Hasenfus, 1978 ), the authors explored the relationship between creativity and EEG alpha band presence during different stages of the creative process. There were two experiments in this study. Experiment 1 found that highly creative individuals had lower alpha wave presence during the elaboration stage of the creative process, while Experiment 2 found that effort to be original during the inspiration stage was associated with higher alpha wave presence. Overall, the findings suggest that creativity is associated with changes in EEG alpha wave presence during different stages of the creative process. However, the relationship is complex and may depend on factors such as effort to be original and the specific stage of the creative process.
Finally, a series of three studies indicated a link between creativity and hemispheric asymmetry during creative tasks ( Martindale et al., 1984 ). Creative individuals typically exhibited heightened right-hemisphere activity compared to the left during creative output. However, no distinct correlation was found between creativity and varying levels of hemispheric asymmetry during the inspiration versus elaboration phases, suggesting that this relationship is consistent across different stages of creative production. These findings laid the foundation for design creativity studies; they were explored further and confirmed by later work ( Petsche et al., 1997 ), and subsequent studies have used them to validate their results. In addition to these early studies, several reviews, such as Fink and Benedek (2014) , Pidgeon et al. (2016) , and Rominger et al. (2022a) , provide a comprehensive survey of previous studies and their main findings, covering early as well as recent creativity research.
In the preceding sections, we aimed to lay a foundational understanding of neurocognition in creativity, equipping readers with essential knowledge for the subsequent content. In this subsection, we will briefly introduce the main and most important points regarding creativity experiments. More detailed information can be found in Simonton (2000) , Srinivasan (2007) , Arden et al. (2010) , Fink and Benedek (2014) , Pidgeon et al. (2016) , Lazar (2018) , and Hu and Shepley (2022) .
This section presents key details from the selected studies in a structured format to facilitate easy understanding and comparison for readers. As outlined earlier, crucial elements in creativity research include the participants, psychometric tests used, creativity tasks, EEG recording and analysis techniques, and the methods of final data analysis. We have organized these factors, along with the principal findings of each study, into Table 1 . This approach allows readers to quickly grasp the essential information and compare various aspects of different studies. The table format not only aids in presenting data clearly and concisely but also helps in highlighting similarities and differences across studies, providing a comprehensive overview of the field. Following the table, we have included a discussion section, which synthesizes the information from the table and offers interpretations of the trends, implications, and significance of these studies in the broader context of creativity neurocognition. This structured presentation of studies, followed by a detailed discussion, is designed to enhance the reader’s understanding and provide a solid foundation for future research in this dynamic and evolving field.
Table 1 . A summary of EEG-based creativity neurocognition studies.
In our research, we initially conducted a thematic analysis and used a forward and backward snowball search to select relevant studies. Of these, five studies employed machine learning techniques, while the remaining ones concentrated on statistical analysis of EEG features. Notably, all the chosen studies followed a similar methodology: recruiting participants, administering psychometric tests where applicable, conducting creativity tasks, recording EEG data, and concluding with final data analysis.
While most studies follow a similar experimental structure, others focus on different aspects of creativity, such as artistic creativity and poetry, with different evaluation methods and approaches. In Shemyakina and Dan’ko (2004) and Danko et al. (2009) , the authors targeted creativity in producing proverbs or definitions of emotional notions. Other studies ( Leikin, 2013 ; Hetzroni et al., 2019 ) focused on creativity and problem-solving in autism and bilingualism. Some studies, such as Volf and Razumnikova (1999) and Razumnikova (2004) , focused on gender differences in brain organization during creativity tasks. In another study ( Petsche, 1996 ), verbal, visual, and musical creativity were explored through EEG coherence analysis. Similarly, Bhattacharya and Petsche (2005) analyzed brain dynamics during the mental composition of drawings through differences in cortical integration patterns between artists and non-artists. We summarize the findings of EEG-based creativity studies in Table 1 .
Design is closely associated with creativity. On the one hand, by definition, creativity is a measure of the process of creating, for which design, either intentional or unconscious, is an indispensable constituent. On the other hand, it is important to note that not all designs are inherently creative; many designs follow established patterns and resemble existing ones, differing only in their specific context. Early research on design creativity aimed to differentiate between design and design creativity tasks by examining when and how designers exhibited creativity in their work. In recent years, much of the focus in design creativity research has shifted towards cognitive and neurocognitive investigations, as well as the development of computational models to simulate creative processes ( Borgianni and Maccioni, 2020 ; Lloyd-Cox et al., 2022 ). Neurocognitive studies employ neuroimaging methods (such as EEG), while computational models often leverage artificial intelligence or cognitive modeling techniques ( Zeng and Yao, 2009 ; Gero, 2020 ; Gero and Milovanovic, 2020 ). In this section, we review significant EEG-based studies in design creativity, focusing on design creation and highlighting the differences from general creativity research. While most studies have processed EEG to provide more detailed insight into brain dynamics, some, such as Goel (2014) , outlined a preliminary framework encompassing the cognitive and neuropsychological systems essential for explaining creativity in designing artifacts.
Several studies have recorded and analyzed EEG in design and design creativity tasks. Most neurocognitive studies have directly or indirectly employed frequency-based analysis, which examines EEG in specific frequency bands: delta (0.5–4 Hz), theta (4–8 Hz), alpha (8–13 Hz), beta (13–30 Hz), and gamma (>30 Hz). One of the main analyses is task-related power (TRP), which has provided a foundation for other analyses. It computes the power of the EEG signal in a specific frequency band during a design task relative to the EEG power in the resting state. This analysis is simple and effective and reveals the physiological processes underlying EEG dynamics ( Rominger et al., 2018 ; Jia and Zeng, 2021 ; Gubler et al., 2022 ; Rominger et al., 2022b ).
Frequency-based analyses have been widely employed. For example, Borgianni and Maccioni (2020) applied TRP analysis to compare the neurophysiological activations of mechanical engineers and industrial designers conducting design tasks, including problem-solving, basic design, and open design. Studies agree that higher alpha band activity is sensitive to specific task-related requirements, while lower alpha corresponds to attention processes such as vigilance and alertness ( Klimesch et al., 1998 ; Klimesch, 1999 ; Chrysikou and Gero, 2020 ). Higher alpha activity in the prefrontal region reflects complex cognitive processes, higher internal attention (as in imagination), and inhibition of task-irrelevant processing ( Fink et al., 2009a , b ; Fink and Benedek, 2014 ). In contrast, higher alpha activity in the occipital and temporal lobes corresponds to visualization processes ( Vieira et al., 2022a ). In design research, frequency-based analysis has been widely employed ( Liu et al., 2018 ) to compare EEG characteristics across design activities such as idea generation or evaluation ( Liu et al., 2016 ). Higher alpha is associated with open-ended tasks, visual association in expert designers, and divergent thinking ( Nguyen and Zeng, 2014b ; Nguyen et al., 2019 ), while higher beta and theta play a pivotal role in convergent thinking and constrained tasks ( Nguyen and Zeng, 2010 ; Liu et al., 2016 ; Liang and Liu, 2019 ).
The research in design and design creativity is not limited to frequency-based analyses. Nguyen et al. (2019) introduced microstate analysis to EEG-based design studies. Using microstate analysis, Jia and Zeng investigated EEG characteristics in a design creativity experiment ( Jia and Zeng, 2021 ), in which EEG signals were recorded while participants performed modified TTCT tasks ( Nguyen and Zeng, 2014b ).
Following the same approach, Jia et al. (2021) analyzed EEG microstates to decode brain dynamics in design cognitive states including problem understanding, idea generation, rating idea generation, idea evaluation, and rating idea evaluation, where six design problems including designing a birthday cake, a toothbrush, a recycle bin, a drinking fountain, a workplace, and a wheelchair were used for the EEG based design experimental studies ( Nguyen and Zeng, 2017 ). The data of these two loosely controlled EEG-based design experiments are summarized and available for the research community ( Zangeneh Soroush et al., 2024 ).
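Microstate analysis typically begins with the global field power (GFP), the spatial standard deviation across channels at each time point, and clusters the scalp topographies at GFP peaks into a small number of classes. A minimal sketch of the peak-extraction step is shown below (random synthetic data; real analyses use dedicated toolboxes and modified k-means clustering of the topographies):

```python
import numpy as np

rng = np.random.default_rng(1)
eeg = rng.normal(size=(32, 1000))  # (channels, samples), hypothetical data

# GFP: standard deviation over channels at each time point
gfp = eeg.std(axis=0)

# Local maxima of the GFP curve: candidate time points whose scalp
# topographies would be clustered into microstate classes
peaks = np.flatnonzero((gfp[1:-1] > gfp[:-2]) & (gfp[1:-1] > gfp[2:])) + 1
topographies = eeg[:, peaks]  # one column (scalp map) per GFP peak
print(gfp.shape, topographies.shape)
```

Restricting clustering to GFP peaks is motivated by their high signal-to-noise ratio: topographies tend to remain quasi-stable between successive peaks.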
We summarized the findings of EEG-based design and design creativity studies in Table 2 .
Table 2 . A summary of EEG-based design creativity neurocognition studies.
The selected studies span a broad range of years, stretching from 1975 ( Martindale and Mines, 1975 ) to the present day, reflecting advancements in neuro-imaging techniques and machine learning methods that have significantly aided researchers in their investigations. From the earliest studies to more recent ones, the primary focus has centered on EEG sub-bands, brain asymmetry, coherence analysis, and brain topography. Recently, machine learning methods have been employed to classify EEG samples into designers’ cognitive states. These studies can be roughly classified into the following distinct categories based on their proposed experiments and EEG analysis methods ( Pidgeon et al., 2016 ; Jia, 2021 ): (1) visual creativity versus baseline rest/fixation, (2) visual creativity versus non-rest control task(s), (3) individuals of high versus low creativity, (4) generation of original versus standard visual images, (5) creativity in virtual reality vs. real environment, (6) loosely controlled vs. strictly controlled creativity experiments.
The included studies exhibited considerable variation in the tasks utilized and the primary contrasts examined. Some studies employed frequency-based or EEG power analysis to compare brain activity during visual creativity tasks with tasks involving verbal creativity or both verbal and visual tasks; these tasks often entail memory tasks or tasks focused on convergent thinking. Several studies adopted a simpler approach, comparing electrophysiological activity during visual creativity tasks against a baseline fixation or rest condition. Some compared neural activities between individuals with high and low levels of creativity, while others compared the generation of original creative images with that of standard creative images. Several studies analyzed brain behavior with respect to creativity factors such as fluency and originality, typically employing statistical techniques to elucidate differences between creativity factors and their corresponding brain behaviors. This variability underscores the diverse approaches taken by researchers to examine the neural correlates of design creativity ( Pidgeon et al., 2016 ). However, few studies delved deeply into areas such as gender differences in creativity, creativity among individuals with mental or physical disorders, or creativity across job positions or skill sets, suggesting significant untapped potential within the EEG-based design creativity research area.
In recent years, advancements in fMRI imaging and its applications have led several studies to replace EEG with fMRI to investigate brain behavior. fMRI measures metabolic (hemodynamic) activity, yielding relatively high spatial resolution compared to EEG, though at lower temporal resolution. Despite this tradeoff, the shift towards fMRI highlights the ongoing evolution and exploration of neuroimaging techniques in understanding the neural correlates of design creativity. fMRI studies provide a deep understanding of the neural circuits associated with creativity and can be used to evaluate EEG-based studies ( Abraham et al., 2018 ; Japardi et al., 2018 ; Zhuang et al., 2021 ).
The emergence of virtual reality (VR) has had a significant impact on design creativity studies, offering a wide range of experimentation possibilities. VR enables researchers to create diverse scenarios and creativity tasks, providing a dynamic and immersive environment for participants ( Agnoli et al., 2021 ; Chang et al., 2022 ). Through VR technology, various design creativity experiments can be conducted, allowing for novel approaches and innovative methodologies to explore the creative process. This advancement opens up new avenues for researchers to investigate the complexities of design creativity more interactively and engagingly.
Despite the significant progress over the past few decades, design and design creativity neurocognition research is still in its early stages due to the challenges identified in Zhao et al. (2020) and Jia et al. (2021) , which are summarized below:
1. Design tasks are open-ended: there is no single correct or predetermined optimal outcome, and countless feasible solutions may exist.
2. Design tasks are ill-defined as finding a solution might change or redefine the original task, leading to new tasks emerging.
3. Various emergent design tasks trigger design knowledge and solutions, which in turn can change or redefine tasks further.
4. The process of completing a design task depends on emerging tasks and the perceived priorities for completion.
5. The criteria to evaluate a design solution are set by the solution itself.
While many lessons learned from creativity neurocognition research can be borrowed to study design and design creativity neurocognition, new paradigms should be proposed, tested, and validated to advance this new discipline. This advancement will, in turn, move creativity neurocognition research forward.
Concerning the model described in Figure 1 , we arranged the following sections to cover all the main components of EEG-based design creativity studies. To give a general picture of such studies, we briefly explain the procedure of these experiments. Since most design creativity neurocognition research more or less inherited the procedures of general creativity research, we focus on the AUT and TTCT tasks; the introduction of a loosely controlled paradigm, tEEG, can be found in Zhao et al. (2020) , Jia et al. (2021) , and Jia and Zeng (2021) . As Tables 1 , 2 show, almost all included studies record EEG signals while selected participants perform creativity tasks. The first step is determining the sample size, recruiting participants, and choosing the psychometric criteria by which participants are selected. In some studies, participants take psychometric tests before performing the creativity tasks, for screening or categorization. In this review, the tasks used to gauge creativity are the Alternate Uses Task (AUT) and the Torrance Tests of Creative Thinking (TTCT). During these tasks, EEG is recorded and then preprocessed to remove artifacts. The artifact-free EEG is then processed to extract features, which are subsequently subjected to either statistical analysis or machine learning. Statistical analysis typically compares brain dynamics across different creativity tasks, such as idea generation, idea evolution, and idea evaluation, while machine learning categorizes EEG signals by the associated creativity task. The final stage is data analysis, which aims to deduce how brain dynamics correlate with the creativity tasks given to participants; it also compares EEG results with psychometric test findings to discern significant differences in EEG dynamics and neural activity between groups.
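A minimal numerical sketch of the preprocessing-to-features portion of this pipeline is given below (synthetic single-channel data; the 100 µV rejection threshold and 2 s epoch length are illustrative assumptions, not values taken from the reviewed studies):

```python
import numpy as np

fs = 128                                     # sampling rate in Hz (assumed)
rng = np.random.default_rng(2)
signal = rng.normal(0, 10, fs * 60)          # one minute of fake EEG (µV)
signal[fs * 30] = 500                        # inject one large artifact

epoch_len = fs * 2                           # 2-second epochs
n_epochs = signal.size // epoch_len
epochs = signal[: n_epochs * epoch_len].reshape(n_epochs, epoch_len)

# Simple artifact rejection: drop epochs whose peak amplitude exceeds 100 µV
clean = epochs[np.abs(epochs).max(axis=1) < 100]

# Example feature: per-epoch variance, later fed to statistics or a classifier
features = clean.var(axis=1)
print(n_epochs, clean.shape[0], features.shape)
```

Real studies add band-pass filtering, ICA-based ocular artifact removal, and multi-channel features, but the epoch-reject-extract skeleton is the same.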
The first factor in these studies is the participants. In most studies, participants are right-handed, non-medicated, and have normal or corrected-to-normal vision. In some cases, the Edinburgh Handedness Inventory ( Oldfield, 1971 ) (with 11 items) or the hand dominance test (HDT) ( Steingrüber et al., 1971 ) was employed to determine participants’ handedness ( Rominger et al., 2020 ; Gubler et al., 2023 ; Mazza et al., 2023 ). While right-handedness has been considered in several creativity studies, it has been mentioned less often in design creativity studies.
In most studies, participants are undergraduate or graduate students with diverse educational backgrounds and an age range of 18 to 30 years. In the included papers, participants did not report any history of psychiatric or neurological disorders or treatment. Some studies, such as Ayoobi et al. (2022) and Gubler et al. (2022) , analyzed creativity under health conditions such as multiple sclerosis and chronic pain, respectively; these studies usually conduct statistical analyses of the results of creativity tasks such as the AUT or the Remote Associates Test (RAT) and then associate the results with the health condition. Some studies report that participants were asked to abstain from smoking for 1 h, coffee for 2 h, alcohol for 12 h, and other stimulating beverages for several hours before the experiments. As mentioned in some design creativity studies, similar rules apply to design creativity experiments.
In most studies, the sample size ranged from 15 to 45 participants, except for a few studies ( Jauk et al., 2012 ; Perchtold-Stefan et al., 2020 ; Rominger et al., 2022a , b ) with larger samples of 100, 55, 93, and 74 participants, respectively. Some studies, such as Agnoli et al. (2020) and Rominger et al. (2020) , calculated their required sample size with the G*Power software ( Faul et al., 2007 ), given their desired power for detecting a specific interaction effect involving response, hemisphere, and position ( Agnoli et al., 2020 ). Design creativity studies show the same trend, with minimum and maximum sample sizes of 8 and 84 participants, respectively; similarly, in a few studies, sample sizes were estimated through statistical methods such as G*Power ( Giannopulu et al., 2022 ).
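For readers unfamiliar with such power analyses, the per-group sample size for a two-sample comparison can be approximated analytically. The sketch below uses a normal approximation and illustrative parameter values; it is not a substitute for G*Power, which also handles interaction effects and exact t distributions:

```python
from statistics import NormalDist
from math import ceil

def sample_size_two_groups(effect_size, alpha=0.05, power=0.80):
    """Approximate n per group for a two-sample t-test.

    effect_size is Cohen's d; uses the normal approximation
    n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2.
    """
    z_a = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
    z_b = NormalDist().inv_cdf(power)
    return ceil(2 * ((z_a + z_b) / effect_size) ** 2)

# Medium effect (d = 0.5), 80% power, alpha = 0.05
n = sample_size_two_groups(0.5)
print(n)  # about 63 per group
```

This illustrates why many of the reviewed studies, with 15 to 45 participants, are powered only for fairly large effects.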
In most studies, a considerable number of participants were excluded for reasons such as not being fluent in the language of the experiment, left-handedness, poor signal quality, extensive EEG artifacts, misunderstanding the experimental procedure, technical errors, data loss during the experiment, no variance in the ratings, and insufficient behavioral data. This shows that recording a high-quality dataset is quite challenging, as several factors determine whether the quality is acceptable. Two publicly accessible datasets (in design and creativity) have recently been published in Mendeley Data ( Zangeneh Soroush et al., 2023a , b ). To the best of our knowledge, these are the only publicly available datasets of EEG signals recorded in design and design creativity experiments.
Regarding gender, only a few of the included papers directly focused on the association between gender, design creativity, and brain dynamics ( Vieira et al., 2021 , 2022a ). Most of the included papers did not use participants’ gender as an inclusion or exclusion criterion, and in some cases participants’ genders were not reported.
Before the EEG recording sessions, participants are often screened using psychometric tests, which are usually employed to categorize participants based on different aspects of intellectual ability, ideational fluency, and cognitive development. These tests provide scores on various cognitive abilities. Additionally, personality tests are used to create personas for participants, and self-report questionnaires measure traits such as anxiety, mood, and depression. The psychometric tests include the Intelligenz-Struktur-Test 2000-R (I-S-T 2000 R), which assesses general mental ability and specific intellectual abilities such as visuospatial, numerical, and verbal abilities. The Big Five test measures the personality traits of conscientiousness, extraversion, neuroticism, openness to experience, and agreeableness. Other tests, such as Spielberger’s State–Trait Anxiety Inventory (STAI), address mood and anxiety, while the Eysenck Personality Questionnaire (EPQ-R) investigates possible personality correlates of task performance ( Fink and Neubauer, 2006 , 2008 ; Fink et al., 2009a ; Jauk et al., 2012 ; Wang et al., 2019 ). To the best of our knowledge, the included design creativity studies have not directly utilized psychometrics ( Table 2 ) to explore the association between participants’ cognitive characteristics and brain behavior. A few studies have used cognitive characteristics indirectly; for instance, Eymann et al. (2022) assessed the shared mechanisms of creativity and intelligence in creative reasoning and their correlations with EEG characteristics.
In this section, we introduce some experimental creativity tasks such as the Alternate Uses Task (AUT), and the Torrance Test of Creative Thinking (TTCT). Here, we would like to shed light on these tasks and their correlation with design creativity. One of the main characteristics of design creativity is divergent thinking as its first phase which is addressed by these two creativity tasks. In addition, AUT and TTCT are adopted and modified by several studies such as Hartog et al. (2020) , Hartog (2021) , Jia et al. (2021) , Jia and Zeng (2021) , and Li et al. (2021) for design creativity neurocognition studies. The figural version of TTCT is aligned with the goals of design creativity tasks where designers (specifically in engineering domains) create or draw their ideas, generate solutions, and evaluate and evolve generated solutions ( Srinivasan, 2007 ; Mayseless et al., 2014 ; Jia et al., 2021 ).
Furthermore, design creativity studies have introduced different types of design tasks, from sequences of simple design problems to constrained and open design tasks ( Nguyen et al., 2018 ; Vieira et al., 2022a ). This variety of tasks opens new perspectives for design creativity neurocognition studies. For example, a sequence of six design problems has been employed in some studies ( Nguyen and Zeng, 2014b ), and ill-defined design tasks have been used to explore differences in brain dynamics between novice and expert designers ( Vieira et al., 2020d ).
The Alternate Uses Task (AUT), established by Guilford (1967) , is a prominent tool in psychological evaluations for assessing divergent thinking, an essential element of creativity. In the AUT ( Guilford, 1967 ), participants are prompted to think of new and unconventional uses for everyday objects. Each object is usually shown twice – initially in the normal (common) condition and subsequently in the uncommon condition. In the common condition, participants are asked to consider regular, everyday uses for the objects; in the uncommon condition, they are encouraged to come up with as many unique, inventive uses as they can ( Stevens and Zabelina, 2020 ). The test includes several items for consideration, e.g., brick, foil, hanger, helmet, key, magnet, pencil, and pipe. For example, given shoes, participants must think beyond the typical use (foot protection) and envision novel uses (e.g., a plant pot or ashtray). The responses in this classic task do not distinguish between the two key elements of creativity: originality (being novel and unique) and appropriateness (being relevant and meaningful) ( Runco and Mraz, 1992 ; Wang et al., 2017 ). For instance, when using a newspaper in the AUT, responses can vary from common uses like reading or wrapping to more inventive ones like creating a temporary umbrella. By requiring participants to generate multiple uses for everyday objects, the AUT measures creativity through four main criteria: fluency (quantity of ideas), originality (uniqueness of ideas), flexibility (diversity of idea categories), and elaboration (detail in ideas) ( Cropley, 2000 ; Runco and Acar, 2012 ). In addition to the original indices of the AUT, some creativity tests include other indices such as fluency-valid and usefulness.
Usefulness refers to how functional the ideas are ( Cropley, 2000 ; Runco and Acar, 2012 ), whereas fluency-valid counts only unique, non-repeated ideas ( Prent and Smit, 2020 ). The AUT’s straightforward design and versatility make it a favored method for gauging creative capacity in diverse groups and settings, reflecting its universal applicability in creativity assessment ( Runco and Acar, 2012 ).
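As an illustration, the AUT indices described above can be computed mechanically once responses have been categorized and normed. The Python sketch below uses hypothetical category assignments, norming counts, and a 5% originality threshold; real AUT scoring relies on trained raters and normative databases.

```python
def score_aut(responses, categories, sample_counts, total_respondents):
    """Score one participant's AUT responses (illustrative sketch only).

    responses         -- list of idea strings produced by the participant
    categories        -- dict mapping each idea to a rater-assigned semantic category
    sample_counts     -- dict: how many respondents in a norming sample gave each idea
    total_respondents -- size of the norming sample
    """
    unique = list(dict.fromkeys(responses))   # drop repeats -> basis for fluency-valid
    fluency = len(responses)                  # quantity of ideas
    fluency_valid = len(unique)               # unique, non-repeated ideas only
    flexibility = len({categories.get(r) for r in unique})  # diversity of categories
    # Originality (hypothetical criterion): an idea counts as original if
    # fewer than 5% of the norming sample produced it.
    originality = sum(
        1 for r in unique
        if sample_counts.get(r, 0) / total_respondents < 0.05
    )
    return {"fluency": fluency, "fluency_valid": fluency_valid,
            "flexibility": flexibility, "originality": originality}

# Hypothetical participant data for the item "brick".
scores = score_aut(
    responses=["doorstop", "paperweight", "doorstop", "plant stand"],
    categories={"doorstop": "hold", "paperweight": "hold", "plant stand": "support"},
    sample_counts={"doorstop": 40, "paperweight": 12, "plant stand": 1},
    total_respondents=100,
)
```

Elaboration is omitted here because it requires a human judgment of the detail in each idea rather than a count.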
Developed by E. Paul Torrance in the late 1960s, the Torrance Test of Creative Thinking (TTCT) ( Torrance, 1966 ) is a foundational instrument for evaluating creative thinking. TTCT is recognized as a highly popular and extensively utilized tool for assessing creativity. Unlike the AUT, the TTCT is more structured and exists in two versions: verbal and figural. The verbal part of the TTCT, known as TTCT-Verbal, includes several subtests ( Almeida et al., 2008 ): (a) Asking Questions and Making Guesses (subtests 1, 2, and 3), where participants are required to pose questions and hypothesize about potential causes and effects; (b) Improvement of a Product (subtest 4), which involves suggesting modifications to the product; (c) Unusual Uses (subtest 5), where participants think of creative and atypical uses; and (d) Supposing (subtest 6), where participants imagine the outcomes of an unlikely event, as per Torrance. The figural component, TTCT-Figural, contains three tasks ( Almeida et al., 2008 ): (a) creating a drawing; (b) completing an unfinished drawing; and (c) developing a new drawing starting from parallel lines. An example of a figural TTCT task might involve uniquely finishing a partially drawn image, with evaluations based on the aforementioned criteria ( Rominger et al., 2018 ).
The TTCT includes a range of real-world reflective activities that encourage diverse thinking styles, essential for daily life and professional tasks. The TTCT assesses abilities in Questioning, Hypothesizing Causes and Effects, and Product Enhancement, each offering insights into an individual’s universal creative potential and originality ( Boden, 2004 ; Runco and Jaeger, 2012 ; Sternberg, 2020 ). It acts like a comprehensive test battery, evaluating multiple facets of creativity’s complex nature ( Guzik et al., 2023 ).
There are also other creativity tests such as the Remote Associates Test (RAT), the Runco Creativity Assessment Battery (rCAB), and the Consensual Assessment Technique (CAT). The TTCT is valued for its extensive historical database of human responses, which serves as a benchmark for comparison, owing to the consistent demographic profile of participants over many years and the systematic gathering of responses for evaluation ( Kaufman et al., 2008 ). The Alternate Uses Task (AUT) and the Remote Associates Test (RAT) are appreciated for their straightforward administration, scoring, and analysis. The Consensual Assessment Technique (CAT) is notable for its adaptability to specific fields, made possible by employing a panel of experts in relevant domains to assess creative works. Consequently, the CAT is particularly suited for evaluating creative outputs in historical contexts or significant “Big-C” creativity ( Kaufman et al., 2010 ). In contrast, the AUT and TTCT are more relevant for examining creativity in everyday, psychological, and professional contexts. As such, the AUT and TTCT establish a solid baseline for more complex design creativity studies employing more realistic design problems.
Electroencephalogram (EEG) signal analysis is a crucial component in the study of creativity whereby brain behavior associated with creativity tasks can be explored. Due to its advantages, EEG has emerged as one of the most suitable neuroimaging techniques for investigating brain activity during creativity tasks. Its affordability and suitability for studies involving physical movement, ease of recording and usage, and notably high temporal resolution make EEG a preferred choice in creativity research.
The dynamics during creative tasks are complex, nonlinear, and self-organized ( Nguyen and Zeng, 2012 ). It can thus be assumed that the brain exhibits similar characteristics, which should be reflected in EEG signals. Capturing these complex and nonlinear patterns of brain behavior can be challenging for other neuroimaging methods ( Soroush et al., 2018 ).
In design creativity studies utilizing EEG, the susceptibility of EEG signals to noise and artifacts is a significant concern due to the accompanying physical movements inherent in these tasks. Consequently, EEG preprocessing becomes indispensable in ensuring data quality and reliability. Unfortunately, not all the included studies in this review have clearly explained their pre-processing and artifact removal approaches. There also exist some well-known preprocessing pipelines, such as HAPPE ( Gabard-Durnam et al., 2018 ), which, despite their high efficiency, have rarely been used in design creativity neurocognition ( Jia et al., 2021 ; Jia and Zeng, 2021 ). The included papers in our analysis have introduced various preprocessing methods, including wavelet analysis, frequency-based filtering, and independent component analysis (ICA) ( Beaty et al., 2017 ; Fink et al., 2018 ; Lou et al., 2020 ). The primary objective of preprocessing remains consistent: to obtain high-quality EEG data devoid of noise or artifacts while minimizing information loss. Achieving this goal is crucial for the accurate interpretation and analysis of EEG signals in design creativity research.
Design creativity studies often encompass a multitude of cognitive tasks occurring simultaneously or sequentially, rendering them ill-defined and unstructured. This complexity leads to the generation of unstructured EEG data, posing a challenge for subsequent analysis ( Zhao et al., 2020 ). Therefore, segmentation methods play a crucial role in classifying recorded EEG signals into distinct cognitive tasks, such as idea generation, idea evolution, and idea evaluation.
Several segmentation methods have been adopted, including the ones relying on Task-Related Potential (TRP) analysis and microstate analysis, followed by clustering techniques like K-means clustering ( Nguyen and Zeng, 2014a ; Nguyen et al., 2019 ; Zhao et al., 2020 ; Jia et al., 2021 ; Jia and Zeng, 2021 ; Rominger et al., 2022b ). These methods aid in organizing EEG data into meaningful segments corresponding to different phases of the design creativity process, facilitating more targeted and insightful analysis. In addition, they provide possibilities to look into a more comprehensive list of design-related cognitions implied in but not intended by conventional AUT and TTCT experiments.
While some uniform segmentation methods (such as those based on TRP) employ frequency-based approaches, Nguyen et al. (2019) introduced a fully automatic dynamic method based on microstate analysis. Since then, microstate analysis has been used in several studies to categorize EEG dynamics in design creativity tasks ( Jia et al., 2021 ; Jia and Zeng, 2021 ). Microstate analysis provides a novel method for EEG-based design creativity studies, offering high temporal resolution and topographic results ( Yuan et al., 2012 ; Custo et al., 2017 ; Jia et al., 2021 ; Jia and Zeng, 2021 ).
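To make the clustering idea concrete, the sketch below segments synthetic multichannel EEG by running a plain k-means loop over unit-normalized topographies. This is a deliberately simplified toy: full microstate pipelines additionally use GFP-peak selection and polarity-invariant clustering, and the data here are synthetic.

```python
import numpy as np

def microstate_segment(eeg, n_states=4, n_iter=50):
    """Toy microstate-style segmentation: k-means on unit-normalized topographies.

    eeg -- array of shape (n_samples, n_channels)
    Returns (labels, maps): one state label per time sample plus the state maps.
    """
    # Normalize each time point's topography so clustering reflects the
    # spatial pattern rather than the instantaneous amplitude.
    X = eeg / (np.linalg.norm(eeg, axis=1, keepdims=True) + 1e-12)
    # Deterministic farthest-point initialization of the template maps.
    maps = [X[0]]
    for _ in range(n_states - 1):
        sims = np.max(np.stack(maps) @ X.T, axis=0)
        maps.append(X[np.argmin(sims)])
    maps = np.stack(maps)
    for _ in range(n_iter):
        labels = np.argmax(X @ maps.T, axis=1)   # assign by spatial similarity
        for k in range(n_states):
            members = X[labels == k]
            if len(members):
                m = members.mean(axis=0)
                maps[k] = m / (np.linalg.norm(m) + 1e-12)
    return labels, maps

# Synthetic demo: two alternating topographies over 8 channels plus noise.
rng = np.random.default_rng(0)
t1 = np.zeros(8); t1[0] = 1.0
t2 = np.zeros(8); t2[7] = 1.0
eeg = np.vstack([np.tile(t1, (50, 1)), np.tile(t2, (50, 1))])
eeg += 0.05 * rng.standard_normal(eeg.shape)
labels, maps = microstate_segment(eeg, n_states=2)
```

On this toy input, the label sequence switches exactly where the dominant topography changes, which is the behavior segmentation methods exploit to delimit cognitive phases.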
The EEG data, after undergoing preprocessing, is directed to feature extraction, where relevant attributes are extracted to delve deeper into EEG dynamics and brain activity. These extracted features serve as the basis for conducting statistical analyses or employing machine learning algorithms.
In our review of the literature, we found that EEG frequency, time, and time-frequency analyses are the most commonly employed methods among the papers we considered. Specifically, the EEG alpha, beta, and gamma bands are often highlighted as critical indicators for studying brain dynamics in creativity and design creativity. Significant variations in the EEG bands have been observed during different stages of design creation tasks, including idea generation, idea evaluation, and idea elaboration ( Nguyen and Zeng, 2010 ; Liu et al., 2016 ; Rominger et al., 2019 ; Giannopulu et al., 2022 ; Lukačević et al., 2023 ; Mazza et al., 2023 ). For instance, the very first creativity studies used EEG alpha asymmetry to explore the relationship between creativity and left-hemisphere and right-hemisphere brain activity ( Martindale and Mines, 1975 ; Martindale and Hasenfus, 1978 ; Martindale et al., 1984 ). Other studies divided the EEG alpha band into lower (8–10 Hz) and upper alpha (10–13 Hz) and concluded that low alpha is more significant compared to the high EEG alpha band. Although the alpha band has been extensively explored by previous studies, several studies have also analyzed other EEG sub-bands such as beta, gamma, and delta and later concluded that these sub-bands are also significantly associated with creativity mechanisms, and can explain the differences between genders in different creativity experiments ( Razumnikova, 2004 ; Volf et al., 2010 ; Nair et al., 2020 ; Vieira et al., 2022a ).
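A minimal sketch of the frequency-band analysis described above, assuming NumPy and SciPy are available; the sampling rate and synthetic signal are hypothetical. Band power is estimated from a Welch periodogram and compared across bands.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, band):
    """Mean power spectral density within a frequency band (Hz),
    estimated with Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    lo, hi = band
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].mean()

fs = 250                                    # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# Synthetic "EEG": a dominant 10 Hz (alpha) rhythm plus broadband noise.
x = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.default_rng(0).standard_normal(len(t))

alpha = band_power(x, fs, (8, 13))
beta = band_power(x, fs, (13, 30))
```

The same helper can be applied to the lower (8–10 Hz) and upper (10–13 Hz) alpha sub-bands simply by narrowing the band argument.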
Several studies have utilized Task-related power changes (TRP) to compare the EEG dynamics in different creativity tasks. TRP analysis is a high-temporal resolution method used to examine changes in brain activity associated with specific tasks or cognitive processes. In TRP analysis, the power of EEG signals, typically measured in terms of frequency bands (like alpha, beta, theta, etc.), is analyzed to identify how brain activity varies during the performance of a task compared to baseline or resting states. This method is particularly useful for understanding the dynamics of brain function as it allows researchers to pinpoint which areas of the brain are more active or less active during specific cognitive or motor tasks ( Rominger et al., 2022b ; Gubler et al., 2023 ). Reportedly, TRP has wide usage in EEG-based design creativity studies ( Jia et al., 2021 ; Jia and Zeng, 2021 ; Gubler et al., 2022 ).
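The TRP computation itself is simple; the sketch below follows the log-power-difference definition commonly used in this literature (the power values are hypothetical, in arbitrary units).

```python
import numpy as np

def task_related_power(power_task, power_reference):
    """Task-related power change, following the common definition
    TRP = log(P_task) - log(P_reference). Negative values indicate a
    task-related power decrease relative to the reference interval."""
    return np.log(power_task) - np.log(power_reference)

# Hypothetical alpha-band power at one electrode: halved during the task.
trp = task_related_power(power_task=4.0, power_reference=8.0)
```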
Event-related synchronization (ERS) and de-synchronization (ERD) have also been reported to be effective in creativity studies ( Wang et al., 2017 ). ERD refers to a decrease in EEG power (in a specific frequency band) compared to a baseline state. The reduction in alpha power, for instance, is often interpreted as an increase in cortical activity. Conversely, ERS denotes an increase in EEG power. The increase in alpha power, for example, is associated with a relative decrease in cortical activity ( Doppelmayr et al., 2002 ; Babiloni et al., 2014 ). Researchers have concluded that these two indicators play a pivotal role in creativity studies as they are significantly correlated with brain dynamics during creativity tasks ( Srinivasan, 2007 ; Babiloni et al., 2014 ; Fink and Benedek, 2014 ).
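ERD/ERS is typically expressed as a percentage change in band power relative to baseline. The sketch below uses one common convention (sign conventions vary across papers; the power values are hypothetical).

```python
def erd_ers_percent(power_baseline, power_task):
    """Band-power change relative to baseline, as a percentage:
    negative = event-related desynchronization (ERD, power decrease),
    positive = event-related synchronization (ERS, power increase)."""
    return 100.0 * (power_task - power_baseline) / power_baseline

erd = erd_ers_percent(power_baseline=10.0, power_task=7.0)   # alpha power drops
ers = erd_ers_percent(power_baseline=10.0, power_task=13.0)  # alpha power rises
```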
Brain functional connectivity analysis, EEG source localization, brain topography maps, and event-related potential analysis are other EEG processing methods which have been employed in a few studies ( Srinivasan, 2007 ; Dietrich and Kanso, 2010 ; Giannopulu et al., 2022 ; Kuznetsov et al., 2023 ). Given that these methods have seen little use so far, and considering their potential to provide insight into transient brain activity and into the correlations between the brain lobes, future studies are encouraged to utilize them.
The studies discussed above indicate that EEG frequency analysis is an effective approach for examining brain behavior in creativity and design creativity processes ( Fink and Neubauer, 2006 ; Nguyen and Zeng, 2010 ; Benedek et al., 2011 , 2014 ; Wang et al., 2017 ; Rominger et al., 2018 ; Vieira et al., 2022b ). Analyzing EEG channels in the time or frequency domains across various creativity tasks helps identify key channels contributing to these experiments. TRP and ERD/ERS are well-known EEG analysis methods widely applied in the included studies. Some studies have used other EEG sub-bands such as delta or gamma ( Boot et al., 2017 ; Stevens and Zabelina, 2020 ; Mazza et al., 2023 ). Besides these methods, other studies have utilized EEG connectivity and produced brain topography maps to explore different stages of design creativity. The final stage of EEG-based research involves statistical analysis and classification.
In statistical analysis, researchers examine EEG characteristics like power or alpha band amplitude to determine if there are notable differences during creativity tasks. Comparisons are made across different brain lobes and participants to identify which brain regions are more active during various stages of creativity. Techniques such as TRP, ERD, and ERS are scrutinized using statistical hypothesis testing to see if brain dynamics vary among participants or across different creativity tasks. Additionally, the relationship between EEG features and creativity scores is explored. For instance, researchers might investigate whether there is a link between EEG alpha power and creativity scores like originality and fluency. These statistical analyses can be conducted on either time-domain or frequency-domain EEG data.
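The alpha-power-versus-originality question, for instance, reduces to a correlation test. A sketch with purely synthetic per-participant data (assuming SciPy; the positive association is built into the simulated data and does not reflect any reported result):

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)

# Hypothetical per-participant data: alpha-band TRP values and originality
# scores, simulated with a positive association plus noise.
n = 30
alpha_trp = rng.standard_normal(n)
originality = 2.0 * alpha_trp + rng.standard_normal(n)

r, p = pearsonr(alpha_trp, originality)   # Pearson correlation and p-value
```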
In the classification phase, EEG data are classified according to different cognitive states of the brain. For example, EEG recordings might be classified based on the stages of creativity tasks, such as idea generation and idea evolution ( Hu et al., 2017 ; Stevens and Zabelina, 2020 ; Lloyd-Cox et al., 2022 ; Ahad et al., 2023 ; Şekerci et al., 2024 ). Apart from a few studies that employed machine learning, most studies relied on EEG analysis and statistical methods. In these studies, the main objective is reported to be the classification of designers’ cognitive states, their emotional states, or the level of their creativity. In the included papers, traditional classifiers such as support vector machines and k-nearest neighbor have been employed. Modern deep learning approaches can be used in future studies to extract the hidden valuable information of EEG in design creativity states ( Jia, 2021 ). In open-ended loosely controlled creativity studies, where the phases of creativity are not clearly defined, clustering techniques are employed to categorize or segment EEG time intervals according to the corresponding creativity tasks ( Jia et al., 2021 ; Jia and Zeng, 2021 ). While loosely controlled design creativity studies result in more reliable and natural outcomes compared to strictly controlled ones, analyzing EEG signals in loosely controlled experiments is challenging as the recorded signals are not structured. Clustering methods are applied to microstate analysis to segment EEG signals into pre-defined states and have structured blocks that may align with certain cognitive functions ( Nguyen et al., 2019 ; Jia et al., 2021 ; Jia and Zeng, 2021 ). Therefore, statistical analysis, classification, and clustering form the core methods of data analysis in studies of creativity.
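A minimal sketch of such a classification step, assuming scikit-learn is available. The two-class band-power features below are synthetic stand-ins for states like idea generation vs. idea evaluation, with separation built into the simulation.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical 2-D features (e.g., alpha and beta band power) for two
# cognitive states; the class separation here is purely synthetic.
n = 100
state_a = rng.normal(loc=[1.0, 0.0], scale=0.25, size=(n, 2))
state_b = rng.normal(loc=[0.0, 1.0], scale=0.25, size=(n, 2))
X = np.vstack([state_a, state_b])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
# Standardize features, then fit an RBF support vector machine.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)   # held-out classification accuracy
```

A k-nearest-neighbor classifier could be dropped into the same pipeline in place of the SVM with no other changes.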
Table 2 presents EEG-based design studies with details about the number of participants, psychometric tests where applicable, experiment protocol, EEG analysis methods, and main findings. These studies are reported in this paper to highlight some of the differences between creativity and design creativity.
In addition to the studies reported in Table 2 , previous reviews and studies ( Srinivasan, 2007 ; Nguyen and Zeng, 2010 ; Lazar, 2018 ; Chrysikou and Gero, 2020 ; Hu and Shepley, 2022 ; Kim et al., 2022 ; Balters et al., 2023 ) can be found, which comprehensively reported approaches in design creativity neurocognition. Moreover, neurophysiological studies in design creativity are not limited to EEG or the components in Table 2 . For instance, in Liu et al. (2014) , EEG, heart rate (HR), and galvanic skin response (GSR) were used to detect the designer’s emotions in computer-aided design tasks. They determined the emotional states of CAD design tasks by processing CAD operators’ physiological signals and a fuzzy logic model. Aiello (2022) investigated the effects of external factors (such as light) and human factors on design processes, and also explored the association between behavioral and neurophysiological responses in design creativity experiments. They employed ANOVA tests and found a significant correlation between neurophysiological recordings and daytime, participants’ stress, and their performance in terms of novelty and quality. They also recognized different patterns of brain dynamics corresponding to different kinds of performance measures. Montagna et al. ( Montagna and Candusso, n.d. ; Montagna and Laspia, 2018 ) analyzed brain behavior during the creative ideation process in the earliest phases of product development. In addition to EEG, they employed eye tracking to analyze the correlations between brain responses and eye movements. They utilized statistical analysis to recognize significant differences in brain hemispheres and lobes with respect to participants’ background, academic degree, and gender during the two modes of divergent and convergent thinking.
Although some of their results are not consistent with those from the literature, these experiments shed light on the experiment design and provide insights and a framework for future experiments.
In the present paper, we reviewed EEG-based design creativity studies in terms of their main components such as participants, psychometrics, and creativity tasks. Numerous studies have delved into brain activities associated with design creativity tasks, examined from various angles. While Table 1 showcases studies centered on the Alternate Uses Test (AUT) and the Torrance Tests of Creative Thinking (TTCT), Table 2 summarizes the EEG-based studies on design and design creativity-related tasks. In this section, we discuss the impact of some of the most important factors, including participants, experiment design, and EEG recording and processing, on EEG-based design creativity studies. Research gaps and open questions are then presented based on the discussion.
4.1.1 Psychometrics: do we have the population that we wished for?
Psychometric testing is crucial for participant selection, yet participant screening is often based merely on self-reported information or on educational background. Examining Tables 1 , 2 reveals that psychometrics are not frequently utilized in design creativity studies, indicating a notable gap in these investigations. Future research should consider establishing a standard set of psychometric tests to create comprehensive participant profiles, particularly focusing on intellectual capabilities ( Jauk et al., 2015 ; Ueno et al., 2015 ; Razumnikova, 2022 ). The studies which did employ psychometrics suggest a correlation between cognitive abilities such as intelligence and creativity ( Arden et al., 2010 ; Jung and Haier, 2013 ). The few psychometric tests employed primarily focus on providing a cognitive profile, encompassing factors such as mood, stress, IQ, anxiety, memory, and intelligence. Notably, intelligence-related assessments are more commonly used compared to other tests. These psychometrics are subject to social masking, meaning that unreliable self-reports may be recorded in the experiments, which might yield less accurate findings.
Participant numbers in these studies vary widely, with most having around 40 participants, predominantly students. In the design of experiments, it is important to highlight that the sample size in the selected studies had a mean of 43.76 and a standard deviation of 20.50. It is worth noting that while some studies employed specific experimental designs to determine sample size, many did not have clear and specific criteria for sample size determination, leaving the ideal sample size in such studies an open question. Many studies determine their sample sizes using G*Power ( Erdfelder et al., 1996 ; Faul et al., 2007 ), a prevalent tool for power analysis in social and behavioral research.
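For reference, an a-priori power analysis of the kind G*Power performs can be approximated with a normal-approximation formula for a two-sample t-test. This is a rough sketch only; G*Power's exact t-distribution calculation yields slightly larger sample sizes.

```python
from math import ceil
from scipy.stats import norm

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided, two-sample t-test,
    using the normal approximation n = 2 * ((z_{1-alpha/2} + z_{power}) / d)^2.
    effect_size is Cohen's d."""
    z_a = norm.ppf(1 - alpha / 2)   # critical value for the two-sided test
    z_b = norm.ppf(power)           # quantile corresponding to desired power
    return ceil(2 * ((z_a + z_b) / effect_size) ** 2)

# Medium effect (d = 0.5), alpha = .05, power = .80.
n = n_per_group(effect_size=0.5)
```

Larger effects require fewer participants per group, which is why reporting the anticipated effect size alongside the chosen sample size matters.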
Initial investigations typically involved healthy adults to more thoroughly understand creativity’s underlying mechanisms. These foundational studies, conducted under optimal conditions, aimed to capture the essence of brain behavior during creative tasks. A handful of studies ( Ayoobi et al., 2022 ; Gubler et al., 2022 , 2023 ) have begun exploring creativity in the context of chronic pain or multiple sclerosis, but broader participant diversity remains an area for further research. Additionally, not all studies provided information on the ages of their participants. There is a noticeable gap in research involving older adults or those with health conditions, suggesting an area ripe for future exploration. Diversity in participant backgrounds, such as varying academic disciplines, could offer richer insights, given creativity’s multifaceted nature and its link to individual skills, affect, and perceived workload ( Yang et al., 2022 ). For instance, the creative approaches of students with engineering thinking might differ significantly from those with art thinking.
Gender was not examined in most included studies. There are just a few studies analyzing the effects of gender on creativity and design creativity ( Razumnikova, 2004 ; Volf et al., 2010 ; Vieira et al., 2020b , 2022a ; Gubler et al., 2022 ). There is a notable need for further investigation to fully understand the impact of gender on the brain dynamics of design creativity.
While the Alternate Uses Test (AUT) and the Torrance Tests of Creative Thinking (TTCT) are commonly used in creativity research, other tasks like the Remote Associates Test are also prevalent ( Schuler et al., 2019 ; Zhang et al., 2020 ). The AUT and figural TTCT are particularly favored in design creativity experiments for their compatibility with design tasks, surpassing verbal or other creativity tasks in applicability ( Boot et al., 2017 ). When considering the creativity tasks in the studies, it is notable that the AUT is more frequently utilized than the TTCT, owing to its simplicity and ease of quantifying creativity scores. In contrast, the TTCT often requires subjective assessments and expert ratings for scoring ( Rogers et al., 2023 ). However, both the TTCT and AUT have undergone modifications in several studies to investigate their potential characteristics further ( Nguyen and Zeng, 2014a ).
While the majority of studies have adhered to strictly controlled frameworks for their experiments, a few studies ( Nguyen and Zeng, 2017 ; Nguyen et al., 2019 ; Jia, 2021 ; Jia et al., 2021 ) have adopted novel, loosely controlled approaches, which reportedly yield more natural and reliable results compared to the strictly controlled ones. The rigidity of strictly controlled creativity experiments can exert additional cognitive stress on participants, potentially impacting experimental outcomes. In contrast, loosely controlled experiments are characterized as self-paced and open-ended, allowing participants ample time to comprehend the design problem, generate ideas, evaluate them, and iterate upon them as needed. Recent behavioral and theoretical research suggests that creativity is better explored within a loosely controlled framework, where sufficient flexibility and freedom are essential. This approach, which contrasts with the highly regulated nature of traditional creativity studies, aims to capture the unpredictable elements of design activities ( Zhao et al., 2020 ). Loosely controlled design studies offer a more realistic portrayal of the actual design process. In these settings, participants enjoy the liberty to develop ideas at their own pace, reflecting true design practices ( Jia, 2021 ). The flexibility in such experiments allows for a broader range of scenarios and outcomes, depending on the complexity and the designers’ understanding of the tests and processes. Prior research has confirmed the effectiveness of this approach, examining its validity from both neuropsychological and design perspectives. Despite their less rigid structure, these loosely controlled experiments are valid and consistent with previous studies. Loosely controlled creativity experiments allow researchers to engage with the nonlinear, ill-defined, open-ended, and intricate nature of creativity tasks.
However, it is important to note that data collection and processing can pose challenges in loosely controlled experiments due to the resulting unstructured data. These challenges can be handled through machine learning and signal processing methods ( Zhao et al., 2020 ). For further details regarding the loosely controlled experiments, readers can refer to the provided references ( Zhao et al., 2020 ; Jia et al., 2021 ; Jia and Zeng, 2021 ; Zangeneh Soroush et al., 2024 ).
Participants are affected by external and internal sources during the experiments and are typically asked to avoid caffeine, alcohol, or other stimulating beverages. The influence of stimulants like caffeine and alcohol on creative brain dynamics remains an under-researched area. While some studies have investigated the impact of cognitive and affective stimulation on creativity [such as pain ( Gubler et al., 2022 , 2023 )], more extensive research is needed. Environmental factors like temperature, humidity, and lighting have also been noted to significantly influence creativity ( Kimura et al., 2023 ; Lee and Lee, 2023 ). Investigating these environmental aspects could lead to more conclusive findings. Understanding these variables related to participants and their surroundings will enable more holistic and comprehensive creativity studies.
As previously discussed and generally known in the neuroscience research community, EEG stands out as a simple and cost-effective biosignal with high temporal resolution, facilitating the exploration of millisecond-scale brain dynamics and providing detailed insights into neural activity, as summarized in Balters and Steinert (2017) and Soroush et al. (2018) . However, despite its advantages in creativity experiments, EEG recording is prone to high levels of noise and artifacts due to its low amplitude and bandwidth ( Zangeneh Soroush et al., 2022 ). The inclusion of physical movements in design creativity experiments further increases the likelihood of artifacts such as movement and electrode displacement artifacts. Additionally, it is essential to acknowledge that EEG does have limitations, including relatively low spatial resolution: it provides less spatial information about brain behavior than methods such as fMRI, which maps brain activity in spatial detail.
In design creativity experiments, EEG preprocessing is an indispensable phase ensuring the quality and reliability of the data. Widely employed artifact removal methods include frequency-based filters and independent component analysis. Unfortunately, not all studies provide a detailed description of their artifact removal procedures ( Zangeneh Soroush et al., 2022 ), compromising the reproducibility of the findings. Moreover, while there are standard evaluation metrics for assessing the quality of preprocessed EEG data, these metrics are often overlooked or not discussed in the included papers. It is essential to note that EEG preprocessing extends beyond artifact removal to include the segmentation of unstructured EEG data into well-defined structured EEG windows, each of which corresponds to a specific cognitive task. This presents a challenge, particularly in loosely controlled experiments where the cognitive activities of designers during drawing tasks may not be clearly delineated, since design tasks are recursive, nonlinear, self-paced, and complex, further complicating the segmentation process ( Nguyen and Zeng, 2012 ; Yang et al., 2022 ).
EEG analysis methods in creativity research predominantly utilize frequency-based analysis, with the alpha band (particularly the upper alpha band, 10–13 Hz) being a key focus due to its effectiveness in capturing various phases of creativity, including divergent and convergent thinking. Across studies, a consistent pattern of decreases in EEG power during design creativity compared to rest has been observed in the low-frequency delta and theta bands, as well as in the lower and upper alpha bands in bilateral frontal, central, and occipital brain regions ( Fink and Benedek, 2014 , 2021 ). This phenomenon, known as task-related desynchronization (TRD), is a common finding in EEG analysis during creativity tasks ( Jausovec and Jausovec, 2000 ; Pidgeon et al., 2016 ). A recurrent observation in numerous studies is the link between alpha band activity and creative cognition, particularly original idea generation and divergent thinking. Alpha synchronization, especially in the right hemisphere and frontal regions, is commonly associated with creative tasks and the generation of original ideas ( Rominger et al., 2022a ). Task-Related Power (TRP) analysis in the alpha band is widely used to decipher creativity-related brain activities. Creativity tasks typically result in increased alpha power, with more innovative responses correlating with stronger alpha synchronization in the posterior cortices. The TRP dynamics, marked by an initial rise, subsequent fall, and a final increase in alpha power, reflect the cognitive processes underlying creative ideation ( Rominger et al., 2018 ). Creativity is influenced by both cognitive processes and affective states, with studies showing that cognitive and affective interventions can enhance creative cognition through stronger prefrontal alpha activity. Different creative phases (e.g., idea generation, evolution, evaluation) exhibit unique EEG activity patterns. 
For instance, idea evolution is linked to a smaller decrease in lower alpha power, indicating varying attentional demands ( Fink and Benedek, 2014 , 2021 ; Rominger et al., 2019 , 2022a ; Jia and Zeng, 2021 ).
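The TRP measure discussed above can be sketched numerically: task-related power is the log-transformed band power during the task minus that during a pre-task reference interval, so negative values correspond to desynchronization. Below is a minimal illustration on synthetic signals; the sampling rate, band limits, and Welch settings are assumptions for the example, not taken from the cited studies.

```python
import numpy as np
from scipy.signal import welch

def alpha_power(x, fs, band=(10.0, 13.0)):
    """Mean power spectral density in the (upper) alpha band."""
    f, pxx = welch(x, fs=fs, nperseg=fs)         # 1 Hz frequency resolution
    mask = (f >= band[0]) & (f <= band[1])
    return pxx[mask].mean()

def trp(task, rest, fs):
    """Task-related power: log(task power) - log(reference power).
    Negative values indicate task-related desynchronization."""
    return np.log(alpha_power(task, fs)) - np.log(alpha_power(rest, fs))

rng = np.random.default_rng(0)
fs = 250
t = np.arange(fs * 4) / fs                        # 4 s of synthetic data
# Strong 11 Hz rhythm at rest, attenuated during the "task":
rest = np.sin(2 * np.pi * 11 * t) + 0.1 * rng.standard_normal(t.size)
task = 0.3 * np.sin(2 * np.pi * 11 * t) + 0.1 * rng.standard_normal(t.size)
print(trp(task, rest, fs))                        # negative: alpha power dropped
```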
Hemispheric asymmetry plays a crucial role in creativity, with increased alpha power in the right hemisphere linked to the generation of more novel ideas. This asymmetry intensifies as the creative process unfolds. The frontal cortex, particularly through alpha synchronization, is frequently involved in creative cognition and idea evaluation, indicating a role in top-down control and internal attention ( Benedek et al., 2014 ). The parietal cortex, especially the right parietal cortex, is significant for focused internal attention during creative tasks ( Razumnikova, 2004 ; Benedek et al., 2011 , 2014 ).
EEG phase locking is another frequently employed analysis method. Most studies have focused on EEG coherence, EEG power and frequency analysis, brain asymmetry methods (hemispheric lateralization), and EEG temporal methods ( Rominger et al., 2020 ). However, creativity is a higher-order, complex, nonlinear, and non-stationary cognitive task, so linear and deterministic methods such as frequency-based analysis may not fully capture its intricacies. This motivates the incorporation of alternative, specifically nonlinear, EEG processing methods, which, to our knowledge, have been used only sparingly in creativity research ( Stevens and Zabelina, 2020 ; Jia and Zeng, 2021 ). Additional analyses such as wavelet analysis, brain source separation, and source localization hold promise for future research in this domain.
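As one example of such nonlinear measures, sample entropy quantifies the irregularity of a signal as the negative log-ratio of the counts of similar templates of length m+1 and m. The compact NumPy sketch below uses common default choices (m = 2, tolerance r = 0.2 standard deviations); these are conventions for illustration, not parameters drawn from the reviewed papers.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy: -ln(A/B), where B counts template pairs of length m
    within tolerance r (Chebyshev distance) and A counts pairs of length m+1.
    Lower values indicate a more regular, more predictable signal."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()
    def count(mm):
        n = len(x) - mm + 1
        tpl = np.array([x[i:i + mm] for i in range(n)])
        d = np.max(np.abs(tpl[:, None] - tpl[None, :]), axis=2)
        return (np.sum(d <= r) - n) / 2          # exclude self-matches
    return -np.log(count(m + 1) / count(m))

rng = np.random.default_rng(0)
regular = np.sin(np.linspace(0, 20 * np.pi, 500))   # predictable oscillation
noisy = rng.standard_normal(500)                     # irregular signal
print(sample_entropy(regular) < sample_entropy(noisy))  # True
```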
As mentioned in the previous section, most studies have recruited participants without considering their cognitive profiles and characteristics. In addition, the included studies have adopted two main analytical approaches: traditional statistical analysis and machine learning methods ( Goel, 2014 ; Stevens and Zabelina, 2020 ; Fink and Benedek, 2021 ). It should be noted that almost all of the included studies employed traditional statistical methods to examine their hypotheses or explore the differences between participants performing creativity tasks ( Fink and Benedek, 2014 , 2021 ; Rominger et al., 2019 , 2022a ; Stevens and Zabelina, 2020 ; Jia and Zeng, 2021 ).
Individual differences, such as intelligence, personality traits, and humor comprehension, also affect EEG patterns during creative tasks. For example, individuals with higher monitoring skills and creative potential exhibit distinct alpha power changes during creative ideation and evaluation ( Perchtold-Stefan et al., 2020 ). The diversity in creativity tasks (e.g., AUT, TTCT, verbal tasks) and EEG analysis methods (e.g., ERD/ERS, TRP, phase locking) used in studies highlights the methodological variety in this field, emphasizing the complexity of creativity research and the necessity for multiple approaches to fully grasp its neurocognitive mechanisms ( Goel, 2014 ; Gero and Milovanovic, 2020 ; Rominger et al., 2020 ; Fink and Benedek, 2021 ; Jia and Zeng, 2021 ).
In statistical analysis, studies often assess differences in extracted features across categories. For instance, in one study ( Gopan et al., 2022 ), various features, including nonlinear and temporal features, were extracted from single-channel EEG data to evaluate levels of visual creativity during sketching tasks; this involves comparing groups within the experimental population based on specific features. Notably, traditional statistical analyses not only provide insights into differences between experimental groups but also offer valuable information for machine learning methods ( Stevens and Zabelina, 2020 ). In another study ( Gubler et al., 2023 ), researchers conducted statistical analysis on frequency-based features to explore the impact of experimentally induced pain on creative ideation among female participants using an adaptation of the Alternate Uses Task (AUT). The analysis examined EEG features across channels and brain hemispheres under pain and pain-free conditions. Similarly, in another study ( Benedek et al., 2014 ), researchers conducted statistical analysis on EEG alpha power to investigate the functional significance of alpha power increases in the right parietal cortex, which reflect focused internal attention. They found that the Alternate Uses Task inherently relies on internal attention (sensory independence). Specifically, enforcing internal attention led to increased alpha power only in tasks requiring sensory intake, not in tasks requiring sensory independence. Moreover, sensory-independent tasks generally exhibited higher task-related alpha power than sensory-intake tasks across both experimental conditions ( Benedek et al., 2011 , 2014 ).
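The within-subject comparisons described above typically reduce to a paired test on a per-participant EEG feature across conditions. Below is a hedged sketch with simulated data; the feature values, effect size, and sample size are invented for illustration and do not reproduce any cited result.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 20                                   # hypothetical number of participants
# Simulated per-participant upper-alpha TRP values (arbitrary units):
pain_free = rng.normal(0.5, 0.2, n)      # hypothesized stronger alpha synchronization
pain = rng.normal(0.3, 0.2, n)           # hypothesized attenuation under induced pain

# Paired (within-subject) comparison of the two conditions:
t_stat, p_val = stats.ttest_rel(pain_free, pain)
print(f"t({n - 1}) = {t_stat:.2f}, p = {p_val:.4f}")
```

In practice such tests are run per channel or region and corrected for multiple comparisons.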
Although most studies have employed statistical measures and analyses to investigate brain dynamics, typically in a limited number of participants, there is a considerable lack of within-subjects and between-subjects analyses ( Rominger et al., 2022b ). Several studies differentiate the brain dynamics of expert and novice designers, or of engineering students in different fields ( Vieira et al., 2020c , d ); however, further investigations with larger numbers of participants are required.
While statistical approaches are commonly employed in EEG-based design creativity studies, there is a notable absence of machine learning methods within this domain. Among the included studies, only one ( Gopan et al., 2022 ) utilized machine learning techniques. In this study, statistical and nonlinear features were extracted from preprocessed EEG signals to classify EEG data into predefined cognitive tasks based on EEG characteristics. The study employed machine learning algorithms such as Long Short-Term Memory (LSTM), Support Vector Machines (SVM), and k-Nearest Neighbor (KNN) to classify EEG samples. These methods were utilized to enhance the understanding of the relationship between EEG signals and cognitive tasks, offering a promising avenue for further exploration in EEG-based design creativity research ( Stevens and Zabelina, 2020 ).
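To illustrate the classification setup in such studies, the sketch below implements a plain k-nearest-neighbor classifier on two-dimensional feature vectors standing in for EEG-derived features. The features, class separation, and k are hypothetical; the LSTM and SVM models also used by Gopan et al. (2022) are not reproduced here.

```python
import numpy as np

def knn_predict(train_X, train_y, X, k=3):
    """k-nearest-neighbor classification by majority vote (Euclidean distance)."""
    d = np.linalg.norm(train_X[None, :, :] - X[:, None, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]       # indices of k closest samples
    votes = train_y[nearest]
    return np.array([np.bincount(v).argmax() for v in votes])

rng = np.random.default_rng(1)
# Hypothetical 2-D feature vectors (e.g., alpha TRP, an entropy measure)
# for two cognitive tasks, well separated for illustration:
X0 = rng.normal([0, 0], 0.3, (30, 2))            # task 0
X1 = rng.normal([1, 1], 0.3, (30, 2))            # task 1
X = np.vstack([X0, X1])
y = np.r_[np.zeros(30, int), np.ones(30, int)]

preds = knn_predict(X, y, X, k=3)
print((preds == y).mean())                        # training accuracy, near 1.0
```

A real study would of course report held-out (e.g., cross-validated) accuracy rather than training accuracy.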
In this review, we aimed to empower readers to decide on experiments, EEG markers, feature extraction algorithms, and processing methods based on their study objectives, requirements, and limitations. However, it is essential to acknowledge that this review, while valuable in exploring EEG-based creativity and design creativity, has certain limitations which are summarized below:
1. Our review focuses only on the neuroscientific aspects of prior creativity and design creativity studies. Design methodologies and creativity models should be reviewed elsewhere.
2. The included studies have employed only a limited number of adult participants with no mental or physical disorders.
3. Most studies have utilized fNIRS or EEG, as these are more suitable for design creativity experiments, but we focused only on EEG-based studies.
As discussed above, EEG-based design creativity studies have only recently been introduced to the field of design, which means that research gaps and open questions remain to be addressed in future studies. The following are ten open questions we extracted from this review.
1. What constitutes an optimal protocol for participant selection, creativity task design, and procedural guidelines in EEG-based design creativity research?
2. How can we reconcile inconsistencies arising from variations in creativity tests and procedures across different studies? Furthermore, how can we address disparities between findings in EEG and fMRI studies?
3. What notable disparities exist in brain dynamics when comparing different creativity tests within the realm of design creativity?
4. In what ways can additional physiological markers, such as ECG and eye tracking, contribute to understanding neurocognition in design creativity?
5. How can alternative EEG processing methods beyond frequency-based analysis enhance the study of brain behavior during design creativity tasks?
6. What strategies can be employed to integrate multimodal methods such as simultaneous EEG-fMRI to investigate design creativity?
7. How can the utilization of advanced wearable recording systems facilitate the implementation of more naturalistic and ecologically valid design creativity experiments?
8. What are the most effective approaches for transforming unstructured data into organized formats in loosely controlled creativity experiments?
9. What neural mechanisms are associated with design creativity in various mental and physical disorders?
10. In what ways can the application of advanced EEG processing methods offer deeper insights into the neurocognitive aspects of design creativity?
Design creativity stands as one of the most intricate high-order cognitive tasks, encompassing both mental and physical activities. It is a domain where design and creativity are intertwined, each representing a complex cognitive process in its own right. The human brain, an immensely sophisticated biological system, undergoes numerous intricate dynamics to support creative abilities. The evolution of neuroimaging techniques, computational technologies, and machine learning now enables us to delve deeper into brain behavior during design creativity tasks.
This literature review aims to scrutinize and highlight pivotal and foundational research in this area. Our goal is to provide essential, comprehensive, and practical insights for future investigators in this field. We employed the snowball search method to reach the final set of papers that met our inclusion criteria. In this review, more than 1,500 EEG-based creativity and design creativity studies were screened and assessed. We reviewed over 120 studies with respect to their experimental details, including participants, (design) creativity tasks, and EEG analysis methods, as well as their main findings. Our review reports the most important experimental details of EEG-based studies, and it also highlights research gaps, potential future trends, and promising avenues for future investigations.
MZ: Formal analysis, Investigation, Writing – original draft, Writing – review & editing. YZ: Conceptualization, Funding acquisition, Methodology, Project administration, Resources, Supervision, Writing – review & editing.
The author(s) declare financial support was received for the research, authorship, and/or publication of this article. This work was supported by NSERC Discovery Grant (RGPIN-2019-07048), NSERC CRD Project (CRDPJ514052-17), and NSERC Design Chairs Program (CDEPJ 485989-14).
The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Abraham, A., Rutter, B., Bantin, T., and Hermann, C. (2018). Creative conceptual expansion: a combined fMRI replication and extension study to examine individual differences in creativity. Neuropsychologia 118, 29–39. doi: 10.1016/j.neuropsychologia.2018.05.004
Agnoli, S., Zanon, M., Mastria, S., Avenanti, A., and Corazza, G. E. (2020). Predicting response originality through brain activity: an analysis of changes in EEG alpha power during the generation of alternative ideas. NeuroImage 207:116385. doi: 10.1016/j.neuroimage.2019.116385
Agnoli, S., Zenari, S., Mastria, S., and Corazza, G. E. (2021). How do you feel in virtual environments? The role of emotions and openness trait over creative performance. Creativity 8, 148–164. doi: 10.2478/ctra-2021-0010
Ahad, M. T., Hartog, T., Alhashim, A. G., Marshall, M., and Siddique, Z. (2023). Electroencephalogram experimentation to understand creativity of mechanical engineering students. ASME Open J. Eng. 2:21005. doi: 10.1115/1.4056473
Aiello, L. (2022). Time of day and background: how they affect designers' neurophysiological and behavioural performance in divergent thinking. Polytechnic of Turin.
Almeida, L. S., Prieto, L. P., Ferrando, M., Oliveira, E., and Ferrándiz, C. (2008). Torrance test of creative thinking: the question of its construct validity. Think. Skills Creat. 3, 53–58. doi: 10.1016/j.tsc.2008.03.003
Arden, R., Chavez, R. S., Grazioplene, R., and Jung, R. E. (2010). Neuroimaging creativity: a psychometric view. Behav. Brain Res. 214, 143–156. doi: 10.1016/j.bbr.2010.05.015
Ayoobi, F., Charmahini, S. A., Asadollahi, Z., Solati, S., Azin, H., Abedi, P., et al. (2022). Divergent and convergent thinking abilities in multiple sclerosis patients. Think. Skills Creat. 45:101065. doi: 10.1016/j.tsc.2022.101065
Babiloni, C., Del Percio, C., Arendt-Nielsen, L., Soricelli, A., Romani, G. L., Rossini, P. M., et al. (2014). Cortical EEG alpha rhythms reflect task-specific somatosensory and motor interactions in humans. Clin. Neurophysiol. 125, 1936–1945. doi: 10.1016/j.clinph.2014.04.021
Baillet, S., Mosher, J. C., and Leahy, R. M. (2001). Electromagnetic brain mapping. IEEE Signal Process. Mag. 18, 14–30. doi: 10.1109/79.962275
Balters, S., and Steinert, M. (2017). Capturing emotion reactivity through physiology measurement as a foundation for affective engineering in engineering design science and engineering practices. J. Intell. Manuf. 28, 1585–1607. doi: 10.1007/s10845-015-1145-2
Balters, S., Weinstein, T., Mayseless, N., Auernhammer, J., Hawthorne, G., Steinert, M., et al. (2023). Design science and neuroscience: a systematic review of the emergent field of design neurocognition. Des. Stud. 84:101148. doi: 10.1016/j.destud.2022.101148
Beaty, R. E., Christensen, A. P., Benedek, M., Silvia, P. J., and Schacter, D. L. (2017). Creative constraints: brain activity and network dynamics underlying semantic interference during idea production. NeuroImage 148, 189–196. doi: 10.1016/j.neuroimage.2017.01.012
Benedek, M., Bergner, S., Könen, T., Fink, A., and Neubauer, A. C. (2011). EEG alpha synchronization is related to top-down processing in convergent and divergent thinking. Neuropsychologia 49, 3505–3511. doi: 10.1016/j.neuropsychologia.2011.09.004
Benedek, M., and Fink, A. (2019). Toward a neurocognitive framework of creative cognition: the role of memory, attention, and cognitive control. Curr. Opin. Behav. Sci. 27, 116–122. doi: 10.1016/j.cobeha.2018.11.002
Benedek, M., Schickel, R. J., Jauk, E., Fink, A., and Neubauer, A. C. (2014). Alpha power increases in right parietal cortex reflects focused internal attention. Neuropsychologia 56, 393–400. doi: 10.1016/j.neuropsychologia.2014.02.010
Bhattacharya, J., and Petsche, H. (2005). Drawing on mind’s canvas: differences in cortical integration patterns between artists and non-artists. Hum. Brain Mapp. 26, 1–14. doi: 10.1002/hbm.20104
Boden, M. A. (2004). The creative mind: Myths and mechanisms . London and New York: Routledge.
Boot, N., Baas, M., Mühlfeld, E., de Dreu, C. K. W., and van Gaal, S. (2017). Widespread neural oscillations in the delta band dissociate rule convergence from rule divergence during creative idea generation. Neuropsychologia 104, 8–17. doi: 10.1016/j.neuropsychologia.2017.07.033
Borgianni, Y., and Maccioni, L. (2020). Review of the use of neurophysiological and biometric measures in experimental design research. Artif. Intell. Eng. Des. Anal. Manuf. 34, 248–285. doi: 10.1017/S0890060420000062
Braun, V., and Clarke, V. (2012). “Thematic analysis” in APA handbook of research methods in psychology, Vol 2: Research designs: Quantitative, qualitative, neuropsychological, and biological . eds. H. Cooper, P. M. Camic, D. L. Long, A. T. Panter, D. Rindskopf, and K. J. Sher (American Psychological Association), 57–71.
Camarda, A., Salvia, É., Vidal, J., Weil, B., Poirel, N., Houdé, O., et al. (2018). Neural basis of functional fixedness during creative idea generation: an EEG study. Neuropsychologia 118, 4–12. doi: 10.1016/j.neuropsychologia.2018.03.009
Chang, Y., Kao, J.-Y., and Wang, Y.-Y. (2022). Influences of virtual reality on design creativity and design thinking. Think. Skills Creat. 46:101127. doi: 10.1016/j.tsc.2022.101127
Choi, J. W., and Kim, K. H. (2018). Methods for functional connectivity analysis. in Computational EEG analysis. Biological and medical physics, biomedical engineering . ed. I. M. CH (Singapore: Springer).
Chrysikou, E. G., and Gero, J. S. (2020). Using neuroscience techniques to understand and improve design cognition. AIMS Neurosci. 7, 319–326. doi: 10.3934/Neuroscience.2020018
Cropley, A. J. (2000). Defining and measuring creativity: are creativity tests worth using? Roeper Rev. 23, 72–79. doi: 10.1080/02783190009554069
Cropley, D. H. (2015a). “Chapter 2 – The importance of creativity in engineering” in Creativity in engineering . ed. D. H. Cropley (London: Academic Press), 13–34.
Cropley, D. H. (2015b). “Chapter 3 – Phases: creativity and the design process” in Creativity in engineering . ed. D. H. Cropley (London: Academic Press), 35–61.
Custo, A., Van De Ville, D., Wells, W. M., Tomescu, M. I., Brunet, D., and Michel, C. M. (2017). Electroencephalographic resting-state networks: source localization of microstates. Brain Connect. 7, 671–682. doi: 10.1089/brain.2016.0476
Danko, S. G., Shemyakina, N. V., Nagornova, Z. V., and Starchenko, M. G. (2009). Comparison of the effects of the subjective complexity and verbal creativity on EEG spectral power parameters. Hum. Physiol. 35, 381–383. doi: 10.1134/S0362119709030153
Dietrich, A., and Kanso, R. (2010). A review of EEG, ERP, and neuroimaging studies of creativity and insight. Psychol. Bull. 136, 822–848. doi: 10.1037/a0019749
Doppelmayr, M., Klimesch, W., Stadler, W., Pöllhuber, D., and Heine, C. (2002). EEG alpha power and intelligence. Intelligence 30, 289–302. doi: 10.1016/S0160-2896(01)00101-5
Erdfelder, E., Faul, F., and Buchner, A. (1996). GPOWER: a general power analysis program. Behav. Res. Methods Instrum. Comput. 28, 1–11. doi: 10.3758/BF03203630
Eymann, V., Beck, A.-K., Jaarsveld, S., Lachmann, T., and Czernochowski, D. (2022). Alpha oscillatory evidence for shared underlying mechanisms of creativity and fluid intelligence above and beyond working memory-related activity. Intelligence 91:101630. doi: 10.1016/j.intell.2022.101630
Faul, F., Erdfelder, E., Lang, A.-G., and Buchner, A. (2007). G*power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav. Res. Methods 39, 175–191. doi: 10.3758/BF03193146
Fink, A., and Benedek, M. (2014). EEG alpha power and creative ideation. Neurosci. Biobehav. Rev. 44, 111–123. doi: 10.1016/j.neubiorev.2012.12.002
Fink, A., and Benedek, M. (2021). The neuroscience of creativity. e-Neuroforum 25, 231–240. doi: 10.1515/nf-2019-0006
Fink, A., Benedek, M., Koschutnig, K., Papousek, I., Weiss, E. M., Bagga, D., et al. (2018). Modulation of resting-state network connectivity by verbal divergent thinking training. Brain Cogn. 128, 1–6. doi: 10.1016/j.bandc.2018.10.008
Fink, A., Grabner, R. H., Benedek, M., Reishofer, G., Hauswirth, V., Fally, M., et al. (2009a). The creative brain: investigation of brain activity during creative problem solving by means of EEG and fMRI. Hum. Brain Mapp. 30, 734–748. doi: 10.1002/hbm.20538
Fink, A., Graif, B., and Neubauer, A. C. (2009b). Brain correlates underlying creative thinking: EEG alpha activity in professional vs. novice dancers. NeuroImage 46, 854–862. doi: 10.1016/j.neuroimage.2009.02.036
Fink, A., and Neubauer, A. C. (2006). EEG alpha oscillations during the performance of verbal creativity tasks: differential effects of sex and verbal intelligence. Int. J. Psychophysiol. 62, 46–53. doi: 10.1016/j.ijpsycho.2006.01.001
Fink, A., and Neubauer, A. C. (2008). Eysenck meets Martindale: the relationship between extraversion and originality from the neuroscientific perspective. Personal. Individ. Differ. 44, 299–310. doi: 10.1016/j.paid.2007.08.010
Fink, A., Schwab, D., and Papousek, I. (2011). Sensitivity of EEG upper alpha activity to cognitive and affective creativity interventions. Int. J. Psychophysiol. 82, 233–239. doi: 10.1016/j.ijpsycho.2011.09.003
Gabard-Durnam, L. J., Mendez Leal, A. S., Wilkinson, C. L., and Levin, A. R. (2018). The Harvard automated processing pipeline for electroencephalography (HAPPE): standardized processing software for developmental and high-Artifact data. Front. Neurosci. 12:97. doi: 10.3389/fnins.2018.00097
Gao, M., Zhang, D., Wang, Z., Liang, B., Cai, Y., Gao, Z., et al. (2017). Mental rotation task specifically modulates functional connectivity strength of intrinsic brain activity in low frequency domains: a maximum uncertainty linear discriminant analysis. Behav. Brain Res. 320, 233–243. doi: 10.1016/j.bbr.2016.12.017
Gero, J. S. (1990). Design prototypes: a knowledge representation schema for design. AI Mag. 11:26. doi: 10.1609/aimag.v11i4.854
Gero, J. S. (1994). “Introduction: creativity and design” in Artificial intelligence and creativity: An interdisciplinary approach . ed. T. Dartnall (Netherlands: Springer), 259–267.
Gero, J. S. (1996). Creativity, emergence and evolution in design. Knowl. Based Syst. 9, 435–448. doi: 10.1016/S0950-7051(96)01054-4
Gero, J. (2011). Design creativity 2010. doi: 10.1007/978-0-85729-224-7
Gero, J. S. (2020). Nascent directions for design creativity research. Int. J. Des. Creat. Innov. 8, 144–146. doi: 10.1080/21650349.2020.1767885
Gero, J. S., and Milovanovic, J. (2020). A framework for studying design thinking through measuring designers’ minds, bodies and brains. Design Sci. 6:e19. doi: 10.1017/dsj.2020.15
Giannopulu, I., Brotto, G., Lee, T. J., Frangos, A., and To, D. (2022). Synchronised neural signature of creative mental imagery in reality and augmented reality. Heliyon 8:e09017. doi: 10.1016/j.heliyon.2022.e09017
Goel, V. (2014). Creative brains: designing in the real world. Front. Hum. Neurosci. 8, 1–14. doi: 10.3389/fnhum.2014.00241
Göker, M. H. (1997). The effects of experience during design problem solving. Des. Stud. 18, 405–426. doi: 10.1016/S0142-694X(97)00009-4
Gopan, K. G., Reddy, S. V. R. A., Rao, M., and Sinha, N. (2022). Analysis of single channel electroencephalographic signals for visual creativity: a pilot study. Biomed. Signal Process. Control. 75:103542. doi: 10.1016/j.bspc.2022.103542
Grabner, R. H., Fink, A., and Neubauer, A. C. (2007). Brain correlates of self-rated originality of ideas: evidence from event-related power and phase-locking changes in the EEG. Behav. Neurosci. 121, 224–230. doi: 10.1037/0735-7044.121.1.224
Gubler, D. A., Rominger, C., Grosse Holtforth, M., Egloff, N., Frickmann, F., Goetze, B., et al. (2022). The impact of chronic pain on creative ideation: an examination of the underlying attention-related psychophysiological mechanisms. Eur. J. Pain (United Kingdom) 26, 1768–1780. doi: 10.1002/ejp.2000
Gubler, D. A., Rominger, C., Jakob, D., and Troche, S. J. (2023). How does experimentally induced pain affect creative ideation and underlying attention-related psychophysiological mechanisms? Neuropsychologia 183:108514. doi: 10.1016/j.neuropsychologia.2023.108514
Guilford, J. P. (1959). “Traits of creativity” in Creativity and its cultivation . ed. H. H. Anderson (New York: Harper & Row), 142–161.
Guilford, J. P. (1967). The nature of human intelligence . New York, NY, US: McGraw-Hill.
Guzik, E. E., Byrge, C., and Gilde, C. (2023). The originality of machines: AI takes the Torrance test. Journal of Creativity 33:100065. doi: 10.1016/j.yjoc.2023.100065
Haner, U.-E. (2005). Spaces for creativity and innovation in two established organizations. Creat. Innov. Manag. 14, 288–298. doi: 10.1111/j.1476-8691.2005.00347.x
Hao, N., Ku, Y., Liu, M., Hu, Y., Bodner, M., Grabner, R. H., et al. (2016). Reflection enhances creativity: beneficial effects of idea evaluation on idea generation. Brain Cogn. 103, 30–37. doi: 10.1016/j.bandc.2016.01.005
Hartog, T. (2021). EEG investigations of creativity in engineering and engineering design. Available at: https://shareok.org/handle/11244/329532
Hartog, T., Marshall, M., Alhashim, A., Ahad, M. T., et al. (2020). Work in Progress: using neuro-responses to understand creativity, the engineering design process, and concept generation. Paper Presented at …. Available at: https://par.nsf.gov/biblio/10208519
Hetzroni, O., Agada, H., and Leikin, M. (2019). Creativity in autism: an examination of general and mathematical creative thinking among children with autism Spectrum disorder and children with typical development. J. Autism Dev. Disord. 49, 3833–3844. doi: 10.1007/s10803-019-04094-x
Hu, W.-L., Booth, J. W., and Reid, T. (2017). The relationship between design outcomes and mental states during ideation. J. Mech. Des. 139:51101. doi: 10.1115/1.4036131
Hu, Y., Ouyang, J., Wang, H., Zhang, J., Liu, A., Min, X., et al. (2022). Design meets neuroscience: an electroencephalogram study of design thinking in concept generation phase. Front. Psychol. 13:832194. doi: 10.3389/fpsyg.2022.832194
Hu, L., and Shepley, M. M. C. (2022). Design meets neuroscience: a preliminary review of design research using neuroscience tools. J. Inter. Des. 47, 31–50. doi: 10.1111/joid.12213
Japardi, K., Bookheimer, S., Knudsen, K., Ghahremani, D. G., and Bilder, R. M. (2018). Functional magnetic resonance imaging of divergent and convergent thinking in big-C creativity. Neuropsychologia 118, 59–67. doi: 10.1016/j.neuropsychologia.2018.02.017
Jauk, E., Benedek, M., and Neubauer, A. C. (2012). Tackling creativity at its roots: evidence for different patterns of EEG alpha activity related to convergent and divergent modes of task processing. Int. J. Psychophysiol. 84, 219–225. doi: 10.1016/j.ijpsycho.2012.02.012
Jauk, E., Neubauer, A. C., Dunst, B., Fink, A., and Benedek, M. (2015). Gray matter correlates of creative potential: a latent variable voxel-based morphometry study. NeuroImage 111, 312–320. doi: 10.1016/j.neuroimage.2015.02.002
Jausovec, N., and Jausovec, K. (2000). EEG activity during the performance of complex mental problems. Int. J. Psychophysiol. 36, 73–88. doi: 10.1016/S0167-8760(99)00113-0
Jia, W. (2021). Investigating neurocognition in design creativity under loosely controlled experiments supported by EEG microstate analysis [Concordia University]. Available at: https://spectrum.library.concordia.ca/id/eprint/988724/
Jia, W., von Wegner, F., Zhao, M., and Zeng, Y. (2021). Network oscillations imply the highest cognitive workload and lowest cognitive control during idea generation in open-ended creation tasks. Sci. Rep. 11:24277. doi: 10.1038/s41598-021-03577-1
Jia, W., and Zeng, Y. (2021). EEG signals respond differently to idea generation, idea evolution and evaluation in a loosely controlled creativity experiment. Sci. Rep. 11:2119. doi: 10.1038/s41598-021-81655-0
Jung, R. E., and Haier, R. J. (2013). “Creativity and intelligence: brain networks that link and differentiate the expression of genius” in Neuroscience of creativity . eds. O. Vartanian, A. S. Bristol, and J. C. Kaufman (Cambridge, MA: MIT Press). 233–254. (Accessed 18 June 2024).
Jung, R. E., and Vartanian, O. (Eds.). (2018). The Cambridge handbook of the neuroscience of creativity . Cambridge: Cambridge University Press.
Kaufman, J. C., Beghetto, R. A., Baer, J., and Ivcevic, Z. (2010). Creativity polymathy: what Benjamin Franklin can teach your kindergartener. Learn. Individ. Differ. 20, 380–387. doi: 10.1016/j.lindif.2009.10.001
Kaufman, J. C., John Baer, J. C. C., and Sexton, J. D. (2008). A comparison of expert and nonexpert Raters using the consensual assessment technique. Creat. Res. J. 20, 171–178. doi: 10.1080/10400410802059929
Kaufman, J. C., and Sternberg, R. J. (Eds.). (2010). The Cambridge handbook of creativity. Cambridge University Press.
Kim, N., Chung, S., and Kim, D. I. (2022). Exploring EEG-based design studies: a systematic review. Arch. Des. Res. 35, 91–113. doi: 10.15187/adr.2022.11.35.4.91
Kimura, T., Mizumoto, T., Torii, Y., Ohno, M., Higashino, T., and Yagi, Y. (2023). Comparison of the effects of indoor and outdoor exercise on creativity: an analysis of EEG alpha power. Front. Psychol. 14:1161533. doi: 10.3389/fpsyg.2023.1161533
Klimesch, W. (1999). EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis. Brain Res. Rev. 29, 169–195. doi: 10.1016/s0165-0173(98)00056-3
Klimesch, W., Doppelmayr, M., Russegger, H., Pachinger, T., and Schwaiger, J. (1998). Induced alpha band power changes in the human EEG and attention. Neurosci. Lett. 244, 73–76. doi: 10.1016/S0304-3940(98)00122-0
Kruk, K. A., Aravich, P. F., Deaver, S. P., and deBeus, R. (2014). Comparison of brain activity during drawing and clay sculpting: a preliminary qEEG study. Art Ther. 31, 52–60. doi: 10.1080/07421656.2014.903826
Kuznetsov, I., Kozachuk, N., Kachynska, T., Zhuravlov, O., Zhuravlova, O., and Rakovets, O. (2023). Inner speech as a brain mechanism for preconditioning creativity process. East Eur. J. Psycholinguist. 10, 136–151. doi: 10.29038/eejpl.2023.10.1.koz
Lazar, L. (2018). The cognitive neuroscience of design creativity. J. Exp. Neurosci. 12:117906951880966. doi: 10.1177/1179069518809664
Lee, J. H., and Lee, S. (2023). Relationships between physical environments and creativity: a scoping review. Think. Skills Creat. 48:101276. doi: 10.1016/j.tsc.2023.101276
Leikin, M. (2013). The effect of bilingualism on creativity: developmental and educational perspectives. Int. J. Biling. 17, 431–447. doi: 10.1177/1367006912438300
Li, S., Becattini, N., and Cascini, G. (2021). Correlating design performance to EEG activation: early evidence from experimental data. Proceedings of the Design Society. Available at: https://www.cambridge.org/core/journals/proceedings-of-the-design-society/article/correlating-design-performance-to-eeg-activation-early-evidence-from-experimental-data/8F4FCB64135209CAD9B97C1433E7CB99
Liang, C., Chang, C. C., and Liu, Y. C. (2019). Comparison of the cerebral activities exhibited by expert and novice visual communication designers during idea incubation. Int. J. Des. Creat. Innov. 7, 213–236. doi: 10.1080/21650349.2018.1562995
Liberati, A., Altman, D. G., Tetzlaff, J., Mulrow, C., Gøtzsche, P. C., Ioannidis, J. P. A., et al. (2009). The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. J. Clin. Epidemiol. 62, e1–e34. doi: 10.1016/j.jclinepi.2009.06.006
Liu, L., Li, Y., Xiong, Y., Cao, J., and Yuan, P. (2018). An EEG study of the relationship between design problem statements and cognitive behaviors during conceptual design. Artif. Intell. Eng. Des. Anal. Manuf. 32, 351–362. doi: 10.1017/S0890060417000683
Liu, L., Nguyen, T. A., Zeng, Y., and Hamza, A. B. (2016). Identification of relationships between electroencephalography (EEG) bands and design activities. Volume 7. doi: 10.1115/DETC2016-59104
Liu, Y., Ritchie, J. M., Lim, T., Kosmadoudi, Z., Sivanathan, A., and Sung, R. C. W. (2014). A fuzzy psycho-physiological approach to enable the understanding of an engineer’s affect status during CAD activities. Comput. Aided Des. 54, 19–38. doi: 10.1016/j.cad.2013.10.007
Lloyd-Cox, J., Chen, Q., and Beaty, R. E. (2022). The time course of creativity: multivariate classification of default and executive network contributions to creative cognition over time. Cortex 156, 90–105. doi: 10.1016/j.cortex.2022.08.008
Lou, S., Feng, Y., Li, Z., Zheng, H., and Tan, J. (2020). An integrated decision-making method for product design scheme evaluation based on cloud model and EEG data. Adv. Eng. Inform. 43:101028. doi: 10.1016/j.aei.2019.101028
Lukačević, F., Becattini, N., Perišić, M. M., and Škec, S. (2023). Differences in engineers’ brain activity when CAD modelling from isometric and orthographic projections. Sci. Rep. 13:9726. doi: 10.1038/s41598-023-36823-9
Martindale, C., and Hasenfus, N. (1978). EEG differences as a function of creativity, stage of the creative process, and effort to be original. Biol. Psychol. 6, 157–167. doi: 10.1016/0301-0511(78)90018-2
Martindale, C., Hines, D., Mitchell, L., and Covello, E. (1984). EEG alpha asymmetry and creativity. Personal. Individ. Differ. 5, 77–86. doi: 10.1016/0191-8869(84)90140-5
Martindale, C., and Mines, D. (1975). Creativity and cortical activation during creative, intellectual and eeg feedback tasks. Biol. Psychol. 3, 91–100. doi: 10.1016/0301-0511(75)90011-3
Mastria, S., Agnoli, S., Zanon, M., Acar, S., Runco, M. A., and Corazza, G. E. (2021). Clustering and switching in divergent thinking: neurophysiological correlates underlying flexibility during idea generation. Neuropsychologia 158:107890. doi: 10.1016/j.neuropsychologia.2021.107890
Mayseless, N., Aharon-Peretz, J., and Shamay-Tsoory, S. (2014). Unleashing creativity: the role of left temporoparietal regions in evaluating and inhibiting the generation of creative ideas. Neuropsychologia 64, 157–168. doi: 10.1016/j.neuropsychologia.2014.09.022
Mazza, A., Dal Monte, O., Schintu, S., Colombo, S., Michielli, N., Sarasso, P., et al. (2023). Beyond alpha-band: the neural correlate of creative thinking. Neuropsychologia 179:108446. doi: 10.1016/j.neuropsychologia.2022.108446
Mokyr, J. (1990). The lever of riches: Technological creativity and economic progress : New York and Oxford: Oxford University Press.
Montagna, F., and Candusso, A. (n.d.). Electroencephalogram: the definition of the assessment methodology for verbal responses and the analysis of brain waves in an idea creativity experiment. In webthesis.biblio.polito.it. Available at: https://webthesis.biblio.polito.it/13445/1/tesi.pdf
Montagna, F., and Laspia, A. (2018). A new approach to investigate the design process. webthesis.biblio.polito.it. Available at: https://webthesis.biblio.polito.it/10011/1/tesi.pdf
Nagai, Y., and Gero, J. (2012). Design creativity. J. Eng. Des. 23, 237–239. doi: 10.1080/09544828.2011.642495
Nair, N., Hegarty, J. P., Ferguson, B. J., Hecht, P. M., Tilley, M., Christ, S. E., et al. (2020). Effects of stress on functional connectivity during problem solving. NeuroImage 208:116407. doi: 10.1016/j.neuroimage.2019.116407
Nguyen, P., Nguyen, T. A., and Zeng, Y. (2018). Empirical approaches to quantifying effort, fatigue and concentration in the conceptual design process. Res. Eng. Des. 29, 393–409. doi: 10.1007/s00163-017-0273-4
Nguyen, P., Nguyen, T. A., and Zeng, Y. (2019). Segmentation of design protocol using EEG. Artif. Intell. Eng. Des. Anal. Manuf. 33, 11–23. doi: 10.1017/S0890060417000622
Nguyen, T. A., and Zeng, Y. (2010). Analysis of design activities using EEG signals. Vol. 5: 277–286. doi: 10.1115/DETC2010-28477
Nguyen, T. A., and Zeng, Y. (2012). A theoretical model of design creativity: nonlinear design dynamics and mental stress-creativity relation. J. Integr. Des. Process. Sci. 16, 65–88. doi: 10.3233/jid-2012-0007
Nguyen, T. A., and Zeng, Y. (2014a). A physiological study of relationship between designer’s mental effort and mental stress during conceptual design. Comput. Aided Des. 54, 3–18. doi: 10.1016/j.cad.2013.10.002
Nguyen, T. A., and Zeng, Y. (2014b). A preliminary study of EEG spectrogram of a single subject performing a creativity test. Proceedings of the 2014 international conference on innovative design and manufacturing (ICIDM), 16–21. doi: 10.1109/IDAM.2014.6912664
Nguyen, T. A., and Zeng, Y. (2017). Effects of stress and effort on self-rated reports in experimental study of design activities. J. Intell. Manuf. 28, 1609–1622. doi: 10.1007/s10845-016-1196-z
Oldfield, R. C. (1971). The assessment and analysis of handedness: the Edinburgh inventory. Neuropsychologia 9, 97–113. doi: 10.1016/0028-3932(71)90067-4
Pahl, G., Beitz, W., Feldhusen, J., and Grote, K.-H. (1988). Engineering design: A systematic approach . ( Vol. 3 ). London: Springer.
Peng, W. (2019). EEG preprocessing and denoising. In EEG signal processing and feature extraction. doi: 10.1007/978-981-13-9113-2_5
Perchtold-Stefan, C. M., Papousek, I., Rominger, C., Schertler, M., Weiss, E. M., and Fink, A. (2020). Humor comprehension and creative cognition: shared and distinct neurocognitive mechanisms as indicated by EEG alpha activity. NeuroImage 213:116695. doi: 10.1016/j.neuroimage.2020.116695
Petsche, H. (1996). Approaches to verbal, visual and musical creativity by EEG coherence analysis. Int. J. Psychophysiol. 24, 145–159. doi: 10.1016/S0167-8760(96)00050-5
Petsche, H., Kaplan, S., von Stein, A., and Filz, O. (1997). The possible meaning of the upper and lower alpha frequency ranges for cognitive and creative tasks. Int. J. Psychophysiol. 26, 77–97. doi: 10.1016/S0167-8760(97)00757-5
Pidgeon, L. M., Grealy, M., Duffy, A. H. B., Hay, L., McTeague, C., Vuletic, T., et al. (2016). Functional neuroimaging of visual creativity: a systematic review and meta-analysis. Brain Behavior 6:e00540. doi: 10.1002/brb3.540
Prent, N., and Smit, D. J. A. (2020). The dynamics of resting-state alpha oscillations predict individual differences in creativity. Neuropsychologia 142:107456. doi: 10.1016/j.neuropsychologia.2020.107456
Razumnikova, O. M. (2004). Gender differences in hemispheric organization during divergent thinking: an EEG investigation in human subjects. Neurosci. Lett. 362, 193–195. doi: 10.1016/j.neulet.2004.02.066
Razumnikova, O. M. (2022). Baseline measures of EEG power as correlates of the verbal and nonverbal components of creativity and intelligence. Neurosci. Behav. Physiol. 52, 124–134. doi: 10.1007/s11055-022-01214-6
Razumnikova, O. M., Volf, N. V., and Tarasova, I. V. (2009). Strategy and results: sex differences in electrographic correlates of verbal and figural creativity. Hum. Physiol. 35, 285–294. doi: 10.1134/S0362119709030049
Rogers, C. J., Tolmie, A., Massonnié, J., et al. (2023). Complex cognition and individual variability: a mixed methods study of the relationship between creativity and executive control. Front. Psychol. 14:1191893. doi: 10.3389/fpsyg.2023.1191893
Rominger, C., Benedek, M., Lebuda, I., Perchtold-Stefan, C. M., Schwerdtfeger, A. R., Papousek, I., et al. (2022a). Functional brain activation patterns of creative metacognitive monitoring. Neuropsychologia 177:108416. doi: 10.1016/j.neuropsychologia.2022.108416
Rominger, C., Gubler, D. A., Makowski, L. M., and Troche, S. J. (2022b). More creative ideas are associated with increased right posterior power and frontal-parietal/occipital coupling in the upper alpha band: a within-subjects study. Int. J. Psychophysiol. 181, 95–103. doi: 10.1016/j.ijpsycho.2022.08.012
Rominger, C., Papousek, I., Perchtold, C. M., Benedek, M., Weiss, E. M., Schwerdtfeger, A., et al. (2019). Creativity is associated with a characteristic U-shaped function of alpha power changes accompanied by an early increase in functional coupling. Cogn. Affect. Behav. Neurosci. 19, 1012–1021. doi: 10.3758/s13415-019-00699-y
Rominger, C., Papousek, I., Perchtold, C. M., Benedek, M., Weiss, E. M., Weber, B., et al. (2020). Functional coupling of brain networks during creative idea generation and elaboration in the figural domain. NeuroImage 207:116395. doi: 10.1016/j.neuroimage.2019.116395
Rominger, C., Papousek, I., Perchtold, C. M., Weber, B., Weiss, E. M., and Fink, A. (2018). The creative brain in the figural domain: distinct patterns of EEG alpha power during idea generation and idea elaboration. Neuropsychologia 118, 13–19. doi: 10.1016/j.neuropsychologia.2018.02.013
Runco, M. A., and Acar, S. (2012). Divergent thinking as an indicator of creative potential. Creat. Res. J. 24, 66–75. doi: 10.1080/10400419.2012.652929
Runco, M. A., and Jaeger, G. J. (2012). The standard definition of creativity. Creat. Res. J. 24, 92–96. doi: 10.1080/10400419.2012.650092
Runco, M. A., and Mraz, W. (1992). Scoring divergent thinking tests using total ideational output and a creativity index. Educ. Psychol. Meas. 52, 213–221. doi: 10.1177/001316449205200126
Sanei, S., and Chambers, J. A. (2013). EEG signal processing . John Wiley & Sons.
Schuler, A. L., Tik, M., Sladky, R., Luft, C. D. B., Hoffmann, A., Woletz, M., et al. (2019). Modulations in resting state networks of subcortical structures linked to creativity. NeuroImage 195, 311–319. doi: 10.1016/j.neuroimage.2019.03.017
Schwab, D., Benedek, M., Papousek, I., Weiss, E. M., and Fink, A. (2014). The time-course of EEG alpha power changes in creative ideation. Front. Hum. Neurosci. 8:310. doi: 10.3389/fnhum.2014.00310
Şekerci, Y., Kahraman, M. U., Özturan, Ö., Çelik, E., and Ayan, S. Ş. (2024). Neurocognitive responses to spatial design behaviors and tools among interior architecture students: a pilot study. Sci. Rep. 14:4454. doi: 10.1038/s41598-024-55182-7
Shemyakina, N. V., and Dan’ko, S. G. (2004). Influence of the emotional perception of a signal on the electroencephalographic correlates of creative activity. Hum. Physiol. 30, 145–151. doi: 10.1023/B:HUMP.0000021641.41105.86
Simon, H. A. (1996). The sciences of the artificial . 3rd Edn: MIT Press.
Simonton, D. K. (2000). Creativity: cognitive, personal, developmental, and social aspects. American psychologist, 55:151.
Simonton, D. K. (2012). Taking the U.S. patent office criteria seriously: a quantitative three-criterion creativity definition and its implications. Creat. Res. J. 24, 97–106. doi: 10.1080/10400419.2012.676974
Soroush, M. Z., Maghooli, K., Setarehdan, S. K., and Nasrabadi, A. M. (2018). A novel method of eeg-based emotion recognition using nonlinear features variability and Dempster–Shafer theory. Biomed. Eng.: Appl., Basis Commun. 30:1850026. doi: 10.4015/S1016237218500266
Srinivasan, N. (2007). Cognitive neuroscience of creativity: EEG based approaches. Methods 42, 109–116. doi: 10.1016/j.ymeth.2006.12.008
Steingrüber, H.-J., Lienert, G. A., and Gustav, A. (1971). Hand-Dominanz-Test : Verlag für Psychologie Available at: https://cir.nii.ac.jp/crid/1130282273024678144 .
Sternberg, R. J. (2020). What’s wrong with creativity testing? J. Creat. Behav. 54, 20–36. doi: 10.1002/jocb.237
Sternberg, R. J., and Lubart, T. I. (1998). “The concept of creativity: prospects and paradigms” in Handbook of creativity . ed. R. J. Sternberg (Cambridge: Cambridge University Press), 3–15.
Stevens, C. E., and Zabelina, D. L. (2020). Classifying creativity: applying machine learning techniques to divergent thinking EEG data. NeuroImage 219:116990. doi: 10.1016/j.neuroimage.2020.116990
Teplan, M. (2002). Fundamentals of EEG measurement. Available at: https://api.semanticscholar.org/CorpusID:17002960
Torrance, E. P. (1966). Torrance tests of creative thinking (TTCT). APA PsycTests . doi: 10.1037/t05532-000
Ueno, K., Takahashi, T., Takahashi, K., Mizukami, K., Tanaka, Y., and Wada, Y. (2015). Neurophysiological basis of creativity in healthy elderly people: a multiscale entropy approach. Clin. Neurophysiol. 126, 524–531. doi: 10.1016/j.clinph.2014.06.032
Vieira, S. L. D. S., Benedek, M., Gero, J. S., Cascini, G., and Li, S. (2021). Brain activity of industrial designers in constrained and open design: the effect of gender on frequency bands. Proceedings of the Design Society, 1(AUGUST), 571–580. doi: 10.1017/pds.2021.57
Vieira, S., Benedek, M., Gero, J., Li, S., and Cascini, G. (2022a). Brain activity in constrained and open design: the effect of gender on frequency bands. Artif. Intell. Eng. Des. Anal. Manuf. 36:e6. doi: 10.1017/S0890060421000202
Vieira, S., Benedek, M., Gero, J., Li, S., and Cascini, G. (2022b). Design spaces and EEG frequency band power in constrained and open design. Int. J. Des. Creat. Innov. 10, 193–221. doi: 10.1080/21650349.2022.2048697
Vieira, S. L. D. S., Gero, J. S., Delmoral, J., Gattol, V., Fernandes, C., and Fernandes, A. A. (2019). Comparing the design neurocognition of mechanical engineers and architects: a study of the effect of Designer’s domain. Proceedings of the Design Society: International Conference on Engineering Design, 1(1), 1853–1862. doi: 10.1017/dsi.2019.191
Vieira, S., Gero, J. S., Delmoral, J., Gattol, V., Fernandes, C., Parente, M., et al. (2020a). The neurophysiological activations of mechanical engineers and industrial designers while designing and problem-solving. Design Sci. 6:e26. doi: 10.1017/dsj.2020.26
Vieira, S., Gero, J. S., Delmoral, J., Li, S., Cascini, G., and Fernandes, A. (2020b). Brain activity in constrained and open design spaces: an EEG study. The Sixth International Conference on Design Creativity-ICDC2020. doi: 10.35199/ICDC.2020.09
Vieira, S., Gero, J. S., Delmoral, J., Parente, M., Fernandes, A. A., Gattol, V., et al. (2020c). “Industrial designers problem-solving and designing: An EEG study” in Research & Education in design: People & Processes & Products & philosophy . eds. R. Almendra and J. Ferreira 211–220. ( 1st ed. ) Lisbon, Portugal. CRC Press.
Vieira, S., Gero, J., Gattol, V., Delmoral, J., Li, S., Cascini, G., et al. (2020d). The neurophysiological activations of novice and experienced professionals when designing and problem-solving. Proceedings of the Design Society: DESIGN Conference, 1, 1569–1578. doi: 10.1017/dsd.2020.121
Volf, N. V., and Razumnikova, O. M. (1999). Sex differences in EEG coherence during a verbal memory task in normal adults. Int. J. Psychophysiol. 34, 113–122. doi: 10.1016/s0167-8760(99)00067-7
Volf, N. V., and Tarasova, I. V. (2010). The relationships between EEG θ and β oscillations and the level of creativity. Hum. Physiol. 36, 132–138. doi: 10.1134/S0362119710020027
Volf, N. V., Tarasova, I. V., and Razumnikova, O. M. (2010). Gender-related differences in changes in the coherence of cortical biopotentials during image-based creative thought: relationship with action efficacy. Neurosci. Behav. Physiol. 40, 793–799. doi: 10.1007/s11055-010-9328-y
Wallas, G. (1926). The art of thought . London: J. Cape.
Wang, Y., Gu, C., and Lu, J. (2019). Effects of creative personality on EEG alpha oscillation: based on the social and general creativity comparative study. J. Creat. Behav. 53, 246–258. doi: 10.1002/jocb.243
Wang, M., Hao, N., Ku, Y., Grabner, R. H., and Fink, A. (2017). Neural correlates of serial order effect in verbal divergent thinking. Neuropsychologia 99, 92–100. doi: 10.1016/j.neuropsychologia.2017.03.001
Wang, Y.-Y., Weng, T.-H., Tsai, I.-F., Kao, J.-Y., and Chang, Y.-S. (2023). Effects of virtual reality on creativity performance and perceived immersion: a study of brain waves. Br. J. Educ. Technol. 54, 581–602. doi: 10.1111/bjet.13264
Williams, A., Ostwald, M., and Askland, H. (2011). The relationship between creativity and design and its implication for design education. Des. Princ. Pract. 5, 57–71. doi: 10.18848/1833-1874/CGP/v05i01/38017
Xie, X. (2023). The cognitive process of creative design: a perspective of divergent thinking. Think. Skills Creat. 48:101266. doi: 10.1016/j.tsc.2023.101266
Yang, J., Quan, H., and Zeng, Y. (2022). Knowledge: the good, the bad, and the ways for designer creativity. J. Eng. Des. 33, 945–968. doi: 10.1080/09544828.2022.2161300
Yang, J., Yang, L., Quan, H., and Zeng, Y. (2021). Implementation barriers: a TASKS framework. J. Integr. Des. Process. Sci. 25, 134–147. doi: 10.3233/JID-210011
Yin, Y., Zuo, H., and Childs, P. R. N. (2023). An EEG-based method to decode cognitive factors in creative processes. AI EDAM. Available at: https://www.cambridge.org/core/journals/ai-edam/article/an-eegbased-method-to-decode-cognitive-factors-in-creative-processes/FD24164B3D2C4ABA3A57D9710E86EDD4
Yuan, H., Zotev, V., Phillips, R., Drevets, W. C., and Bodurka, J. (2012). Spatiotemporal dynamics of the brain at rest--exploring EEG microstates as electrophysiological signatures of BOLD resting state networks. NeuroImage 60, 2062–2072. doi: 10.1016/j.neuroimage.2012.02.031
Zangeneh Soroush, M., Tahvilian, P., Nasirpour, M. H., Maghooli, K., Sadeghniiat-Haghighi, K., Vahid Harandi, S., et al. (2022). EEG artifact removal using sub-space decomposition, nonlinear dynamics, stationary wavelet transform and machine learning algorithms. Front. Physiol. 13:910368. doi: 10.3389/fphys.2022.910368
Zangeneh Soroush, M., Zhao, M., Jia, W., and Zeng, Y. (2023a). Conceptual design exploration: EEG dataset in open-ended loosely controlled design experiments. Mendeley Data . doi: 10.17632/h4rf6wzjcr.1
Zangeneh Soroush, M., Zhao, M., Jia, W., and Zeng, Y. (2023b). Design creativity: EEG dataset in loosely controlled modified TTCT-F creativity experiments. Mendeley Data . doi: 10.17632/24yp3xp58b.1
Zangeneh Soroush, M., Zhao, M., Jia, W., and Zeng, Y. (2024). Loosely controlled experimental EEG datasets for higher-order cognitions in design and creativity tasks. Data Brief 52:109981. doi: 10.1016/j.dib.2023.109981
Zeng, Y. (2001). An axiomatic approach to the modeling of conceptual product design using set theory. Department of Mechanical and Manufacturing Engineering, 218.
Zeng, Y. (2002). Axiomatic theory of design modeling. J. Integr. Des. Process. Sci. 6, 1–28.
Zeng, Y. (2004). Environment-based formulation of design problem. J. Integr. Des. Process. Sci. 8, 45–63.
Zeng, Y. (2015). Environment-based design (EBD): a methodology for transdisciplinary design. J. Integr. Des. Process. Sci. 19, 5–24. doi: 10.3233/jid-2015-0004
Zeng, Y., and Cheng, G. D. (1991). On the logic of design. Des. Stud. 12, 137–141. doi: 10.1016/0142-694X(91)90022-O
Zeng, Y., and Gu, P. (1999). A science-based approach to product design theory part II: formulation of design requirements and products. Robot. Comput. Integr. Manuf. 15, 341–352. doi: 10.1016/S0736-5845(99)00029-0
Zeng, Y., Pardasani, A., Dickinson, J., Li, Z., Antunes, H., Gupta, V., et al. (2004). Mathematical foundation for modeling conceptual design Sketches1. J. Comput. Inf. Sci. Eng. 4, 150–159. doi: 10.1115/1.1683825
Zeng, Y., and Yao, S. (2009). Understanding design activities through computer simulation. Adv. Eng. Inform. 23, 294–308. doi: 10.1016/j.aei.2009.02.001
Zhang, W., Sjoerds, Z., and Hommel, B. (2020). Metacontrol of human creativity: the neurocognitive mechanisms of convergent and divergent thinking. NeuroImage 210:116572. doi: 10.1016/j.neuroimage.2020.116572
Zhao, M., Jia, W., Yang, D., Nguyen, P., Nguyen, T. A., and Zeng, Y. (2020). A tEEG framework for studying designer’s cognitive and affective states. Design Sci. 6:e29. doi: 10.1017/dsj.2020.28
Zhao, M., Yang, D., Liu, S., and Zeng, Y. (2018). Mental stress-performance model in emotional engineering . ed. S. Fukuda. (Cham: Springer). Vol. 6 .
Zhuang, K., Yang, W., Li, Y., Zhang, J., Chen, Q., Meng, J., et al. (2021). Connectome-based evidence for creative thinking as an emergent property of ordinary cognitive operations. NeuroImage 227:117632. doi: 10.1016/j.neuroimage.2020.117632
Keywords: design creativity, creativity, neurocognition, EEG, higher-order cognitive tasks, thematic analysis
Citation: Zangeneh Soroush M and Zeng Y (2024) EEG-based study of design creativity: a review on research design, experiments, and analysis. Front. Behav. Neurosci. 18:1331396. doi: 10.3389/fnbeh.2024.1331396
Received: 01 November 2023; Accepted: 07 May 2024; Published: 01 August 2024.
Copyright © 2024 Zangeneh Soroush and Zeng. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Yong Zeng, [email protected]
Magdalene M. Assimon
University of North Carolina Kidney Center, Division of Nephrology and Hypertension, Department of Medicine, University of North Carolina School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina
Randomized controlled trials (RCTs) are considered the “gold standard” for establishing the safety and efficacy of medical treatments, such as drugs, devices, and procedures. Patients with kidney disease are often excluded from these studies (1), and it is well established that trial participants tend to be healthier than the broader kidney disease population (2). Furthermore, the number of nephrology-specific trials conducted continues to lag behind other subspecialties (3).
In the absence of RCT data, nephrology practitioners may look to population-specific observational evidence to guide therapy selection. Observational studies using real-world data (e.g., administrative claims and electronic healthcare record data) to evaluate the safety and effectiveness of medical treatments can provide highly generalizable and valuable information to clinicians (4). However, like nonrandomized prospective cohort studies, these studies may suffer from biases that limit their validity, such as confounding.
In this commentary, I describe what confounding is and provide a brief overview of common types of confounding that can arise in observational studies of medical treatments. I then highlight some common strategies for addressing confounding and discuss potential sources of residual confounding.
In an observational study, confounding occurs when a risk factor for the outcome also affects the exposure of interest, either directly or indirectly. The resultant bias can strengthen, weaken, or completely reverse the true exposure-outcome association. For a factor to be a confounder, it has to be associated with both the study exposure and the study outcome, and temporally precede the exposure (i.e., it cannot be an intermediary factor on the causal pathway between the exposure and the outcome) (5).
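This defining structure, a factor that raises the probability of both the exposure and the outcome, can be made concrete with a small simulation. The sketch below is illustrative only (all probabilities and variable names are hypothetical): a binary confounder Z influences exposure X and outcome Y, X has no true effect on Y, yet the crude risk ratio is biased upward; stratifying on Z recovers the null.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical confounder Z (e.g., disease severity): it raises both the
# probability of exposure X and the probability of outcome Y.
z = rng.binomial(1, 0.3, n)
x = rng.binomial(1, np.where(z == 1, 0.6, 0.2))  # Z -> X
y = rng.binomial(1, np.where(z == 1, 0.3, 0.1))  # Z -> Y; X has NO effect on Y

def risk_ratio(x, y):
    return y[x == 1].mean() / y[x == 0].mean()

crude = risk_ratio(x, y)  # biased upward, away from the true null of 1.0
stratified = np.mean([risk_ratio(x[z == k], y[z == k]) for k in (0, 1)])

print(f"crude RR = {crude:.2f}, Z-stratified RR = {stratified:.2f}")
```

With these hypothetical probabilities the crude risk ratio lands well above 1 while the Z-stratified estimate sits near 1, illustrating how an unadjusted association can be entirely an artifact of the confounder.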
Confounding by indication (6) is one of the most common forms of bias present in observational studies evaluating the safety and effectiveness of medical treatments. It occurs when the clinical indication for treatment, such as the presence of a disease or disease severity, also affects the outcome of interest. Bias due to confounding by indication can make it appear that a treatment under investigation is associated with the occurrence of an outcome that it is supposed to prevent, especially in studies comparing the use of a medical treatment with nonuse. For example, confounding by indication would likely be present in an observational study assessing the association between aldosterone antagonist use versus nonuse and mortality in patients with heart failure. In such a study, heart failure severity is an important confounder. Clinicians are more likely to prescribe an aldosterone antagonist to patients with more severe heart failure, and more severe heart failure is also a risk factor for death. If heart failure severity is not adequately controlled for, it may appear that the use of an aldosterone antagonist increases the risk of death, which is contrary to existing evidence from placebo-controlled trials (7).
Confounding by frailty (8) can be another important source of bias in observational studies of medical treatments. This type of confounding occurs because frail patients, who are close to death, tend to have a lower likelihood of receiving preventative therapies than individuals who are healthier. When confounding by frailty is present, the preventative treatment being evaluated appears to be more beneficial than it actually is. For instance, confounding by frailty has been proposed as a potential explanation for the implausible 40%–60% mortality reduction seen in observational studies assessing influenza vaccine effectiveness in older adults (9). Compared with healthier patients, frail patients with a poor short-term prognosis may be less likely to receive an influenza vaccine because of a perceived lack of benefit. In this scenario, frailty is a confounder because it is associated with both vaccine receipt and death.
Other types of confounding can arise when healthy behaviors are associated with both the medical exposure under study and the outcome of interest. For example, confounding by the healthy adherer effect (10) occurs because patients who adhere to treatments tend to have a higher likelihood of taking part in other beneficial healthy behaviors (e.g., exercising) than their nonadherent counterparts. When confounding by the healthy adherer effect is present, studies evaluating the effect of treatment adherence versus nonadherence on the occurrence of adverse clinical outcomes will often overestimate the beneficial effects of treatment adherence.
Finally, time-varying confounding occurs when the exposure of interest and potential confounders change across time. A common type of time-varying confounding that may be present in observational studies of medical treatments is “time-varying confounding affected by previous exposure” (11). It arises when the clinical parameter indicating that a treatment change is necessary is independently related to the outcome of interest and is also affected by previous exposure to the treatment (12). For example, in a study assessing the association between erythropoiesis-stimulating agent (ESA) dose and mortality in patients on hemodialysis, serum hemoglobin is a time-varying confounder that needs to be accounted for. Hemoglobin levels predict ESA dose, are influenced by prior ESA dose, and are independently associated with mortality (the outcome).
Confounding can be addressed in the design and analytic phases of observational studies. Common strategies are discussed below, and their advantages and disadvantages are summarized in Table 1.
Advantages and disadvantages of common strategies used to address confounding
| Method | Overview | Advantages | Disadvantages |
|---|---|---|---|
| Restriction | Setting criteria for study inclusion | Easy to implement | Only removes or reduces confounding by the inclusion criteria; reduces sample size; cannot generalize findings to those excluded |
| Matching | Creates matched sets of patients who have similar values of one or more confounders | Intuitive | Difficult to match on multiple confounders; only removes or reduces confounding by the matching factors; unmatched patients are excluded, reducing sample size, effect estimate precision, and generalizability |
| Active comparator | Comparing the treatment of interest to an active comparator rather than treatment nonuse | Mitigates confounding by indication; clinically relevant head-to-head comparison of two or more treatments | Cannot be used when there is only one treatment option |
| Multivariable adjustment | Potential confounders are included as covariates in regression models | Easy to implement in standard statistical software packages | Only controls for measured confounders; the total number of confounders that can be included in regression models is contingent on the number of outcome events |
| Propensity score matching | Each patient who received the treatment of interest is matched to one or more patients who received the comparator treatment with an equivalent propensity score, generating a matched cohort of treated and comparator patients with similar baseline characteristics | Preferred in studies with relatively few outcome events compared with the number of potential confounders; covariate balance between the treated and comparator groups can be checked in the matched cohort | Only controls for measured confounders; unmatched patients are excluded, reducing sample size, effect estimate precision, and generalizability |
| Propensity score weighting | The propensity score is used to generate weights that are applied to the original study cohort to create a pseudo-population of treated and comparator patients with similar baseline characteristics | Preferred in studies with relatively few outcome events compared with the number of potential confounders; covariate balance between the treated and comparator groups can be checked in the weighted cohort | Only controls for measured confounders; less intuitive than propensity score matching |
| G methods | Complex analytic methods that handle time-varying confounding in the setting of time-varying exposures | Appropriately handle time-varying confounding | Only control for measured confounders; complex methods requiring advanced statistical expertise |
Restriction is a method that can be used for confounding control in the design phase. Similar to RCTs, restriction in an observational study involves setting criteria for study inclusion. By limiting the study to individuals who meet specific criteria, confounding by each respective inclusion criterion is either eliminated or reduced. For instance, in an observational study evaluating the risk of fracture associated with the use versus nonuse of benzodiazepines, age and sex are likely important confounders. Restricting the study cohort to males who are <65 years of age would eliminate confounding by sex and reduce confounding by age. Confounding by sex is eliminated because there is no variation in benzodiazepine use by sex—all benzodiazepine users and nonusers are male. Limiting the study cohort to individuals <65 years of age does not completely remove confounding by age, because benzodiazepine use patterns and fracture risk likely vary across the 18- to 64-year-old age group. Although restriction is an intuitive method that can be easily implemented, potential disadvantages include sample size reduction and decreased generalizability.
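In code, restriction amounts to a simple filter on the analytic dataset. The sketch below uses a hypothetical cohort with illustrative column names (not from the commentary) to apply the benzodiazepine example's criteria:

```python
import pandas as pd

# Hypothetical cohort; column names and values are illustrative only.
cohort = pd.DataFrame({"age":        [72, 45, 63, 30, 58, 80],
                       "sex":        ["F", "M", "M", "M", "F", "M"],
                       "benzo_user": [1,  0,  1,  0,  1,  0]})

# Restrict to males under 65: confounding by sex is eliminated (no variation
# in sex remains), while confounding by age is reduced but not eliminated,
# since use patterns and fracture risk still vary within the 18-64 range.
restricted = cohort.query("sex == 'M' and age < 65")
print(restricted)
```

Note that every row failing the criteria is discarded, which is exactly where the method's cost lies: a smaller sample and findings that no longer generalize to the excluded groups.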
Another confounding control strategy that can be used in the design phase is matching. In a cohort study, matching involves selecting a comparator group that is matched to the treatment group on one or more confounders. Usually, individual-level matching is performed. Consider the previously mentioned observational study evaluating the benzodiazepine-fracture association. Because age and sex are important confounders, one or more benzodiazepine nonusers would be matched to a patient taking a benzodiazepine on the basis of age and sex. For example, a 63-year-old female not taking a benzodiazepine would be matched to a 63-year-old female taking a benzodiazepine. Although exact matching on the basis of age is ideal, it may not be possible. Broader age-based matching categories—such as matching on age within 5 years—can be used, but residual confounding by age may remain. In addition, it is important to keep in mind that identifying matched pairs of treated and comparator patients becomes more difficult as the number of matching factors increases.
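Individual-level matching on age and sex can be sketched as a greedy nearest-age search within sex, here with the 5-year caliper mentioned above. The cohort, IDs, and greedy matching order are all hypothetical, for illustration only:

```python
import pandas as pd

# Hypothetical benzodiazepine users (treated) and nonusers (potential controls).
treated = pd.DataFrame({"id":  [1, 2, 3],
                        "age": [63, 70, 41],
                        "sex": ["F", "M", "F"]})
controls = pd.DataFrame({"id":  [10, 11, 12, 13],
                         "age": [64, 55, 40, 69],
                         "sex": ["F", "M", "F", "M"]})

def match_one(row, pool, caliper=5):
    """Return the control id with the same sex and the closest age within
    the caliper, or None if no eligible control remains."""
    cand = pool[(pool["sex"] == row["sex"])
                & ((pool["age"] - row["age"]).abs() <= caliper)]
    if cand.empty:
        return None  # this treated patient stays unmatched and is dropped
    return int(cand.loc[(cand["age"] - row["age"]).abs().idxmin(), "id"])

used, pairs = set(), []
for _, row in treated.iterrows():
    pick = match_one(row, controls[~controls["id"].isin(used)])
    if pick is not None:
        used.add(pick)                      # match without replacement
        pairs.append((int(row["id"]), pick))

print(pairs)  # each tuple is (treated id, matched control id)
```

The caliper makes the trade-off in the text explicit: widening it finds more matches but admits more residual confounding by age, while tightening it leaves more treated patients unmatched.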
Specific to observational studies evaluating medical treatments, a design strategy that can be used to minimize the effect of confounding by indication is using an active comparator rather than a nonuser comparator. The treatment of interest and the selected comparator should have the same clinical indication and therapeutic role and, in the case of medications, the same mode of delivery (4). Furthermore, an active comparator is the only logical comparator choice when intractable confounding by indication is expected. Besides mitigating confounding by indication, head-to-head comparisons of two or more treatments with the same indication provide relevant information on comparative safety and effectiveness that can be used to inform the selection of one treatment over another in clinical practice.
There are several statistical approaches that can be used for confounding control in the analysis phase. Multivariable adjustment, which involves including potential confounders as covariates in regression models, is the most common analytic technique used. However, recently, propensity score methods, such as propensity score matching and propensity score weighting, have gained popularity ( 13 ).
In studies evaluating medical treatments, a propensity score is a patient’s predicted probability of receiving the treatment of interest versus a comparator, given their measured baseline characteristics. This summary score is estimated for each patient in the study cohort and is subsequently used for confounding control. In propensity score matching, each patient who received the treatment of interest is matched to one or more patients who received the comparator and have an equivalent propensity score. This results in a matched cohort of treated and comparator patients with similar baseline characteristics. In propensity score weighting, the propensity score is used to generate weights that are applied to the original study cohort to create a pseudo-population of treated and comparator patients with similar baseline characteristics ( 14 ). The resultant matched and weighted cohorts can be used to estimate the treatment-outcome association, with the influence of measured baseline confounding minimized. Propensity score methods and multivariable adjustment typically yield similar adjusted estimates of the treatment-outcome association ( 13 ). However, because a propensity score combines multiple covariates into a single summary score, these methods are preferred when the exposure of interest is common and the outcome of interest is rare, a setting where multivariable outcome models are susceptible to overfitting. Readers interested in learning more about propensity score methods can refer to the tutorial provided by Fu et al. ( 15 ).
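As a sketch of the weighting step only (the propensity scores below are assumed to have already been estimated, e.g., by logistic regression on baseline covariates; the values are made up), the standard inverse probability of treatment weights are:

```python
def iptw_weights(treated, propensity):
    """Inverse probability of treatment weights for estimating an average
    treatment effect: 1/ps for treated patients, 1/(1 - ps) for comparators.
    The weighted pseudo-population balances measured baseline covariates
    across the treated and comparator groups."""
    return [1.0 / ps if t else 1.0 / (1.0 - ps)
            for t, ps in zip(treated, propensity)]

# Hypothetical patients: treatment indicator and estimated propensity score.
treated = [1, 1, 0, 0]
ps = [0.8, 0.5, 0.5, 0.2]
weights = iptw_weights(treated, ps)
print(weights)  # [1.25, 2.0, 2.0, 1.25]
```

Intuitively, a treated patient who was unlikely to be treated (low propensity score) is up-weighted because they stand in for many similar untreated patients, and vice versa; extreme scores near 0 or 1 produce very large weights, which is why weight trimming is often applied in practice.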
G methods, such as inverse probability–weighted marginal structural models, are complex analytic methods that appropriately handle time-varying confounding in the setting of time-varying exposures. A thorough description of G methods is beyond the scope of this commentary and can be found elsewhere ( 11 ). However, it is important to recognize that the use of these methods is increasing in the field of nephrology.
Despite the use of study designs and analytic strategies that aim to eliminate confounding, residual confounding may persist. Common reasons why residual confounding may be present are: ( 1 ) information on a confounder is not available; ( 2 ) the version of the confounding variable present in the data source is an imperfect surrogate or is misclassified; and ( 3 ) continuous confounders are parameterized as categorical variables, especially when overly broad categories are used ( 16 ).
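For the third point, a small sketch (hypothetical ages and cutoffs) of how coarse categorization discards within-category variation in a confounder:

```python
def categorize(age, cutoffs):
    """Map a continuous age to the index of the category it falls in."""
    return sum(age >= c for c in cutoffs)

ages = [41, 48, 64]

# A single broad 18-64 category makes all three patients indistinguishable,
# even though their underlying risk likely differs with age.
broad = [categorize(a, [18, 65]) for a in ages]
# Finer bands retain more of the age signal, leaving less residual confounding.
fine = [categorize(a, [18, 45, 55, 65]) for a in ages]
print(broad, fine)  # [1, 1, 1] [1, 2, 3]
```

Adjusting for `broad` treats a 41-year-old and a 64-year-old as identical with respect to age, so any age-related difference in risk between them remains uncontrolled.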
Observational studies using real-world data can provide clinically actionable information on the potential benefits and harms of medical treatments in populations excluded from RCTs, such as patients with kidney disease. Confounding is a common source of bias threatening the validity of these studies. Thus, it is important to be aware of the types of confounding that may be present and to understand the advantages and disadvantages of common strategies used for confounding control.
M.M. Assimon reports receiving honoraria from the American Society of Nephrology and the International Society of Nephrology, and investigator-initiated research funding from the Renal Research Institute, a subsidiary of Fresenius Medical Care, North America in the last 3 years.
M.M. Assimon is supported by National Heart, Lung, and Blood Institute grant R01 HL152034.
M.M. Assimon wrote the original draft and reviewed and edited the manuscript.