
Polyhedrika.

Lean Six Sigma Process Improvement

Design of Experiments (DOE) with Excel


Design of Experiments (DOE) is a very useful process improvement methodology. 

Microsoft Excel has some powerful data analysis tools, which I have used successfully for DOE.

When it comes to analyzing cause and effect, we want to know what process factors affect our outputs.

Correlation and Regression can give us an indication of the main factors and the extent to which they affect the outputs.

In some cases, there may be interactions among the different factors, or the mathematical relation between factors and outputs may not be linear.

In these cases it is useful to run a systematic set of experiments that tests all possible combinations of factors, to relate them to the outputs.


Factorial Experiment Example

We want to minimize process loss, and after some brainstorming among the process specialists, we concluded that 5 factors may affect loss. Based on current factor levels, we have selected the following levels to experiment with:   


Download this Excel file DOE_with_Excel.xlsm from OneDrive

Factorial Designs

Full factorial designs are constructed simply by counting in binary, to obtain all the different combinations of 0 and 1 for all factors. In our designs we just replace 0 with -1, to meet the balance property.
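
For readers who want to reproduce this construction outside the workbook, here is a minimal Python sketch of the same idea (counting in binary and mapping 0 to -1). The fifth factor is not named in this excerpt, so it appears as a placeholder:

```python
from itertools import product

import pandas as pd

# Minimal sketch: build a 2^k full factorial design in coded units by
# enumerating all 0/1 combinations and mapping 0 -> -1, 1 -> +1.
# "X5" is a placeholder for the fifth factor, whose name isn't shown here.
factors = ["pH", "Temp", "Press", "Flow", "X5"]

rows = [[2 * bit - 1 for bit in combo] for combo in product([0, 1], repeat=len(factors))]
design = pd.DataFrame(rows, columns=factors)

print(len(design))    # 2^5 = 32 runs
print(design.sum())   # every column sums to 0: the balance property
```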

See sheet  Designs:


1/2 Fraction Design

See sheet Half fraction:

With our 5 factors, to run a full factorial set of experiments we would need 2^5 = 32 experiments.

This could cost time and money before we are sure that all factors really affect our process.

Therefore we might start with a subset of the full factorial as a detection experiment.

A 1/2 fraction involves half the number of experiments: 16.

We can build a 1/2 fraction 5-factor design from a 4-factor full factorial:


We do this by replacing the 4-factor interaction (which is very unlikely to be significant) with the 5th factor (Flow). We therefore construct the Flow column by multiplying the other 4 columns.
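
A sketch of this construction in Python, assuming the four base factors are pH, Temp, Press and a fourth factor whose name is not shown in this excerpt:

```python
from itertools import product

import pandas as pd

# Sketch of the half-fraction construction: start from a 2^4 full factorial
# and set the 5th factor (Flow) equal to the 4-factor interaction.
# "X4" is a placeholder for the fourth factor's name.
base = ["pH", "Temp", "Press", "X4"]
rows = [[2 * b - 1 for b in combo] for combo in product([0, 1], repeat=len(base))]
half = pd.DataFrame(rows, columns=base)

# Flow = pH * Temp * Press * X4, so Flow is aliased with the 4-factor interaction
half["Flow"] = half.prod(axis=1)

print(half)   # 16 runs: a 2^(5-1) half-fraction design
```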

Run the Experiments


Experiments Simulation


We have added 2 additional experiments with the values called center points (all zeros), which correspond (in uncoded values) to the average of the high and low levels of each factor.

We do this to make the design more robust without going all the way to the full 32 experiments.
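
As a small illustration of the coded-to-uncoded conversion, here is a Python sketch; the low and high values below are placeholders, not the ones used in the workbook:

```python
import pandas as pd

# Sketch: append two center-point runs (all zeros in coded units) and map
# coded levels back to real units. The (low, high) pair is a placeholder.
low, high = 60.0, 80.0                      # assumed real units for one factor

design = pd.DataFrame({"Temp": [-1, 1, -1, 1]})
center = pd.DataFrame({"Temp": [0, 0]})     # the two center points
design = pd.concat([design, center], ignore_index=True)

# coded -1 / 0 / +1  ->  low / midpoint / high
design["Temp_real"] = (high + low) / 2 + design["Temp"] * (high - low) / 2
print(design)                               # coded 0 maps to 70.0, the average
```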

The simulator has provided the process loss estimation outputs, for each experiment.

In the simulator, every time we press F9, a new set of experiments is run, giving (as in real life) different results. 

Factor Interactions

We want to estimate the influence of each individual factor on the result, but also the interactions among them, so before we analyze the results we will build a column for each 2-factor interaction.

If we wanted to include 3-factor interactions we would also add them to this matrix.

Sheet Detection:


You obtain each interaction column simply by multiplying the corresponding factor columns: enter the formula in the first row and copy it all the way down.
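
The same step in Python, for comparison (column names and values are only illustrative; this is just the multiplication the Excel formula performs):

```python
from itertools import combinations

import pandas as pd

# Sketch: add a column for every 2-factor interaction by multiplying the
# corresponding factor columns -- the equivalent of a product formula
# copied down each interaction column in Excel.
design = pd.DataFrame(
    {"pH": [-1, 1, -1, 1], "Temp": [-1, -1, 1, 1], "Press": [1, -1, -1, 1]}
)

for a, b in combinations(list(design.columns), 2):
    design[f"{a}*{b}"] = design[a] * design[b]

print(design)   # adds pH*Temp, pH*Press and Temp*Press columns
```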

Regression with Excel Data Analysis


And the result will be:


We notice that R^2 = 0.90, meaning the model explains 90% of the variation in the response, which supports our mathematical model.

The P-value column of the coefficients tells us which factors and interactions are significant: those with p below 0.05.

Factors Temp and Press have p-values above 0.05, but their interaction is below that threshold (0.02), so we must keep them.

We run the Regression again, keeping only the significant factors and interactions (sheet Reduced model):
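
Outside Excel, the same fit-then-reduce workflow can be sketched with statsmodels; the data below are simulated only so the example runs, and are not the workbook's results:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Sketch of the fit-then-reduce workflow: regress loss on the factors and a
# 2-factor interaction, drop non-significant terms (p >= 0.05), but keep the
# main effects whose interaction is significant, then refit.
# The data are simulated here purely to make the example runnable.
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.choice([-1, 1], size=(18, 3)), columns=["pH", "Temp", "Press"])
X["Temp*Press"] = X["Temp"] * X["Press"]
loss = 10 + 2 * X["pH"] - 1.5 * X["Temp*Press"] + rng.normal(0, 0.5, 18)

full = sm.OLS(loss, sm.add_constant(X)).fit()
print(round(full.rsquared, 3), full.pvalues.round(3))

keep = [c for c in X.columns if full.pvalues[c] < 0.05]
if "Temp*Press" in keep:                       # keep Temp and Press as well
    keep = [c for c in X.columns if c in set(keep) | {"Temp", "Press"}]

reduced = sm.OLS(loss, sm.add_constant(X[keep])).fit()
print(reduced.summary())
```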


Process Optimization

We now want to calculate, using this mathematical model of our process, the pH, Temp and Press values that minimize Process Loss.

To do this we will use Excel Solver (see Theory of Constraints).
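
A rough equivalent of what Solver is asked to do, sketched with SciPy; the model coefficients below are placeholders, not the ones from the workbook's regression output:

```python
from scipy.optimize import minimize

# Sketch: minimize the fitted loss model over the coded factor range [-1, +1].
# The coefficients are placeholders standing in for the reduced-model output.
b0, b_ph, b_temp, b_press, b_tp = 10.0, 2.0, 0.8, -0.5, 1.5

def predicted_loss(x):
    ph, temp, press = x
    return b0 + b_ph * ph + b_temp * temp + b_press * press + b_tp * temp * press

result = minimize(predicted_loss, x0=[0.0, 0.0, 0.0], bounds=[(-1, 1)] * 3)
print(result.x, result.fun)   # optimal coded settings and the predicted minimum loss
```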


The result is that the minimum loss of 7.64 can be obtained at the settings found by Solver (in coded units, Press = -1 and Temp = 1; see the graphical representation below).

Confirmation Experiments


Graphical Representation


Entering a value for pH we can see the Loss for each Press-Temp combination.

With the color code we clearly identify the minimum value in dark green: 7.6, which corresponds to Press = -1 and Temp = 1, as calculated by Solver.

3D Representation:


We can show the same data with a 3D representation in Excel.

In spite of the curvature of this shape, the behavior of each factor is linear.

The twisting of the shape is due to the interaction between Pressure and Temperature.

Residuals Analysis to Validate the Model


Test for Normality of the Residuals


If the P value for this normality check were below 0.05, this would indicate lack of normality.

Since it is 0.983, well above 0.05, the residuals pass the test.
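
For reference, the same kind of check can be run in Python. The residuals below are simulated, and the workbook's exact normality test is not specified in this excerpt, so Shapiro-Wilk is used as one common choice:

```python
import numpy as np
from scipy.stats import shapiro

# Sketch: Shapiro-Wilk normality check on regression residuals.
# The residuals are simulated so the example runs on its own.
rng = np.random.default_rng(1)
residuals = rng.normal(0.0, 0.5, size=18)

stat, p_value = shapiro(residuals)
print(f"W = {stat:.3f}, p = {p_value:.3f}")   # a p above 0.05 gives no evidence against normality
```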

Test for Stability of the Residuals


See SPC Analysis.

Conclusions

  • Design of Experiments (DOE) is a powerful methodology for process improvement.
  • It enables the identification of critical process factors based on data rather than impressions.
  • We can estimate the optimal values of these critical factors to optimize the process.
  • Excel provides some useful Data Analysis tools to achieve this.
  • Some process outputs don't have a linear relation with the critical factors. In this case we will need a more sophisticated model such as RSM: Response Surface DOE with Excel
  • Another non-linear example: The Catapult exercise


Best Excel Tutorial


Design of Experiments (DOE) in Excel

Design of Experiments (DOE) is a valuable methodology for optimizing processes and products through systematic experimentation. While conducting complex DOE analyses may require specialized software, Excel can be a useful tool for basic DOE tasks and preliminary analysis.

Here’s a guide on how to use Excel for Design of Experiments:


Define Your Experiment

Clearly define the purpose and goals of your experiment. Determine the factors (variables) you want to study, the levels of those factors, and the response variable you’re trying to optimize or understand.

Create an Experimental Design

Excel doesn’t have built-in tools for generating full experimental designs, but you can create basic designs manually. Use Excel’s worksheet to organize your experiment by listing factors, levels, and experimental runs.

Conduct Experiments

Perform the experiments according to your defined design. Record the data in Excel, with each row representing an experimental run and each column corresponding to a factor or the response variable.

Perform Preliminary Analysis

Excel provides various statistical functions that can help you analyze your experimental data. You can calculate means, standard deviations, and other descriptive statistics to gain insights into your data’s behavior.

Create Visualizations

Use Excel’s charting capabilities to create visual representations of your data. Visualizations like scatter plots, histograms, and line charts can help you identify trends and patterns.

Regression Analysis

Excel’s regression analysis tools can be helpful in determining the relationship between your factors and the response variable. You can perform simple linear regression or multiple regression to model these relationships.

Optimize Your Process

If your goal is optimization, you can use Excel’s Solver add-in to find the optimal settings for your factors that maximize or minimize the response variable. Define your objective function and constraints within Solver to perform optimization.

Interpret the Results

Carefully interpret the results of your experiments and analyses. Determine which factors have the most significant impact on the response variable and make informed decisions based on your findings.


SigmaXL


Overview of Basic Design of Experiments (DOE) Templates

  • Two-Factor, 4-Run, Full-Factorial
  • Three-Factor, 4-Run, Half-Fraction, Res III
  • Three-Factor, 8-Run, Full-Factorial
  • Four-Factor, 8-Run, Half-Fraction, Res IV
  • Four-Factor, 16-Run, Full-Factorial
  • Five-Factor, 8-Run, Quarter-Fraction, Res III
  • Five-Factor, 16-Run, Half-Fraction, Res V

Three Factor Full Factorial Example Using DOE Template

Pareto Chart Coefficients

Multiple Regression and Excel Solver (Advanced Topics):

  • In order to run Multiple Regression analysis we will need to unprotect the worksheet. Click SigmaXL > Help > Unprotect Worksheet.
  • In the Coded Design Matrix, highlight columns A to ABC and the calculated responses as shown:

Coded Design Matrix

  • Select Average (Y), click Numeric Response (Y) >>; holding the CTRL key, select B, C, and BC; click Continuous Predictors (X) >> as shown:

Multiple Regression


Most Practical DOE Explained (with Template)

Published: February 26, 2010 by Kim Niles


For purposes of learning, using, or teaching design of experiments (DOE), one can argue that an eight run array is the most practical and universally applicable array that can be chosen. There are several forms of and names given to the various types of these eight run arrays (e.g., 2^3 Full Factorial, Taguchi L8, 2^4-1 Half Fraction, Plackett-Burman 8-run, etc.), but they are all very similar.

A free Microsoft Excel spreadsheet with a 2^3 Full Factorial array showing the mathematical calculations accompanies this article (click below to download it). Generic steps for using the spreadsheet, precautions, and additional advice are included below.

Click here to download template


Generic Steps For Using The Attached Spreadsheet

There are many different articles in the literature that outline steps that should be taken to complete a DOE. The following steps are recommended for using the accompanying spreadsheet:

  • Determine the acceptance criteria you need (i.e., acceptable alpha error or confidence level for determining what you will accept as passing criteria). This is typically alpha = 0.05 or 95 percent confidence for solid decisions; see additional advice below.
  • Pick 2-3 factors to be tested and assign them to columns A, B and C as applicable (advise using the key provided).
  • Pick 2 different test levels for each of the factors you picked (i.e., low/high, on/off, etc.).
  • Determine the number of samples per run (room for 1-8 only; affects normality and effect accuracy, not confidence).
  • Randomize the order to the extent possible.
  • Run the experiment and collect data. Keep track of everything you think could be important (i.e., people, material lot numbers, etc.). Keep all other possible control factors as constant as possible, as these may affect the validity of the conclusions.
  • Analyze the data by entering the data into the yellow boxes of the spreadsheet and reading the results. A review of the ANOVA table will show you those effects that meet the acceptance criteria established in step number one. If the alpha (p) value in the table is below your acceptance criterion, the effect is statistically significant; if it is above, it is not. Similarly, the higher the confidence, the higher the probability that that factor is statistically different from the others. Signal-to-noise measurements are helpful when selecting factors for re-testing in subsequent experiments.
  • Confirm your results by performing a separate test, another DOE, or in some other way before fully accepting any results. You may want to more closely define results that are close to your acceptance criteria by retesting the factor using larger differences between the levels.

How DOEs Work

Note that using the eight run array, we have four runs being tested with each factor at high levels and four without being at a high level. We have the equivalent of eight data points comparing the effects of each high level (4 high + 4 not high = 8 relative to high) and vice versa for each factor and the interactions between the three factors. Therefore, using this balanced multifactor DOE array, our eight run test becomes the statistical equivalent of a 96 run, one-factor-at-a-time (OFAT) test [(8 Ahigh)+(8 Alow)+(8 Bhigh)+(8 Blow)+(8 Chigh)+(8 Clow)+(8 ABhigh)+(8 ABlow)+(8 AChigh)+(8 AClow)+(8 BChigh)+(8 BClow)]. Other advantages to using DOE include the ability to use statistical software to make predictions about any combination of the factors in between and slightly beyond the different levels, and generating various types of informative two- and three-dimensional plots. Therefore, DOEs produce orders of magnitude more information than OFAT tests at the same or lower test costs.
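
To make the balanced-comparison idea concrete, here is a small Python sketch (with made-up response values) that computes each effect in a 2^3 full factorial as the difference between the mean response at the high and low levels:

```python
from itertools import product

import pandas as pd

# Sketch: in a balanced 2^3 design, every main effect and interaction is
# estimated from all eight runs as mean(y at +1) - mean(y at -1).
# The response values are made up for illustration only.
runs = pd.DataFrame(
    [[2 * b - 1 for b in combo] for combo in product([0, 1], repeat=3)],
    columns=["A", "B", "C"],
)
runs["AB"] = runs["A"] * runs["B"]            # one interaction column
runs["y"] = [54, 60, 52, 66, 55, 61, 53, 68]  # illustrative responses

for col in ["A", "B", "C", "AB"]:
    high = runs.loc[runs[col] == 1, "y"].mean()
    low = runs.loc[runs[col] == -1, "y"].mean()
    print(f"{col} effect = {high - low:+.2f}")
```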

DOEs don't directly compare results against a control or standard. They evaluate all effects and interactions and determine whether there are statistically significant differences among them. They also calculate statistical confidence levels for each measurement. A large effect might result, but if the statistical confidence for that measurement is low, then that effect is not believed. On the other hand, a small measured effect with high confidence tells us that the effect really isn't important.

Why This Array?

Selecting a test array requires balancing your test objectives, test conditions, test strategy, and resources available. Therefore, it is usually more advantageous to run several DOEs testing only a few factors at once than one large DOE. Comparing the statistical power of this array (its inherent ability to resolve differences between test factors; 1 - β) with the cost of performing the experiment (number of runs needed) also shows how this array is advantageous, since it requires only eight runs and yields successful results in most situations.

Picking Acceptance Criteria

Acceptable confidence depends upon your needs. If health could be affected, then you may want more than 99 percent confidence before making a decision. This author’s rule of thumb is that 51-80 percent is considered low but in some cases is worth considering (see precautions below). A confidence level of 80-90 percent is considered moderate with the results likely to be at least partially correct and a confidence level greater than 90 percent is considered high with the results usually considered very likely to be correct.

Calculating Samples Per Run

Since statistical confidence doesn't increase with additional samples per run (replications are needed to have that effect), it is important to remember that additional samples per run are only needed when concerns about non-normal data exist (in accordance with the central limit theorem) and/or to improve measured effect accuracy (significance of changes between test levels). Since test costs and/or knowing the statistical confidence in our effects are usually more important than statistical significance and normality effects, running two to three samples per run is usually ideal.

Precautions

Since this array has less power than others available, we need to remember that when optimizing a process that isn’t critical to human safety, using test results with a low confidence level can often be much better than not knowing which way to go with machine settings, etc. Assuming all the error in one’s experiment is evenly distributed (random distribution of error), a confidence level of 60 percent (measured to be true via DOE), for example, might seem horrible but really means the equivalent of 80 percent, since the 40 percent that we are unsure of could go either way [60 percent + (40 percent / 2) = 80 percent].

The accompanying spreadsheet cannot easily be changed. It should be used when training others (it shows the math), or when you want to perform a quick experiment and are away from statistical software. In its current form, replications cannot be added, nor can center points (center points increase statistical confidence by improving the measurement of error in the experiment).


DOE (Design of Experiments) Help


The experimental design module for SPC for Excel contains the following two-level experimental designs:

  • Two level full factorial designs (up to 7 factors)
  • Two level fractional factorial designs (29 designs to choose from for up to 15 factors)
  • Two level Plackett-Burman designs (up to 27 factors)


To access design of experiments, select “DOE” in the “Analysis” panel in the SPC for Excel ribbon.

The following form is displayed.


The first page lists the designs that are available in the SPC for Excel software. This page is used to set up the designs.

Select this link for more information on setting up the experimental design.

Select this link for more information on analyzing the experimental design.


QI Macros for Excel

Lean Six Sigma & SPC Excel Add-in


Looking for Design of Experiments (DOE) Software for Excel?

QI Macros has ready-made DOE templates for you. You don't have to be an expert to do DOE.

DOE templates in QI Macros

  • Click on QI Macros menu > DOE > Design of Experiments.
  • Conduct tests and enter your data in the template.
  • QI Macros does the rest.

QI Macros Contains These Easy to Use DOE Templates for Excel

  • L16 Taguchi
  • Full Factorial
  • Fractional Factorial 8-run
  • Fractional Factorial 16-run
  • Plackett-Burman 8-run & 8-reflected
  • Plackett-Burman 12-run
  • Plackett-Burman 16-run

Each template contains an "orthogonal array" of the combinations of high and low values to be used in each trial. Conduct your experiments and then drop your data into the yellow shaded input areas. The templates perform the calculations and draw charts and interactions for analysis. A minimum of 8 replications is recommended.

Examples of QI Macros DOE Templates for Excel (screenshots):

  • L-4 Taguchi input area
  • L-4 Taguchi factors
  • L-4 Taguchi interactions
  • L-8 Taguchi input area
  • Fractional Factorial 8-run
  • Plackett-Burman 12-run

Learn More ...

  • Design of Experiments Overview
  • Design of Experiments Example using Popcorn
  • Step by step instructions for how to perform a Design of Experiments

You don't have to be an expert. Use QI Macros DOE Template to mistake-proof your calculations.


Design of Experiments

  • Mixture design in Excel tutorial
  • Surface response design in Excel tutorial
  • Factor effect (screening) design in Excel tutorial
  • Taguchi design in Excel tutorial



Design of Experiments

Introductory Basics

Introduction to Design of Experiments

The Open Educator Textbook Explanation with Examples and Video Demonstrations

Video Demonstration Only, Click on the Topic Below

What is Design of Experiments DOE?

Hypothesis Testing Basic

Explanation of Factor, Response, dependent, independent, variable

Levels of a Factor

Fixed Factor, Random Factor, and Block

Descriptive Statistics and Inferential Statistics

What is Analysis of Variance ANOVA & Why

p-value & Level of Significance

Errors in Statistical Tests Type 1, Type II, Type III

Hypothesis Testing

How to Choose an Appropriate Statistical Method/Test for Your Design of Experiments or Data Analysis

Single Sample Z Test Application, Data Collection, Analysis, Results Explained in MS Excel & Minitab

Single Sample T Test Application, Data Collection, Analysis, Results Explained in MS Excel & Minitab

Single Proportion Test Application, Data Collection, Analysis, Results Explained MS Excel & Minitab

Two-Sample Z Test Application, Data Collection, Analysis, Results Explained Using MS Excel & Minitab

Two Sample T Test Application, Data Collection, Analysis, Results Explained Using MS Excel & Minitab

Paired T Test Application, Data Collection, Analysis, Results Explained Using MS Excel & Minitab

Two Sample/Population Proportion Test Application, Analysis & Result Explained in MS Excel & Minitab

Completely Randomized Design (CRD)

One-Way / Single-Factor Analysis of Variance (ANOVA)

One Way Single Factor Analysis of Variance ANOVA Completely Randomized Design Analysis in MS Excel

One Way Single Factor Analysis of Variance ANOVA Completely Randomized Design Analysis in Minitab

One Way Single Factor Analysis of Variance ANOVA Post Hoc Pairwise Comparison Analysis in MS Excel

Fixed vs Random Effect Model Explained with Examples Using Excel and Minitab

Randomized Complete Block Design

Randomized Complete Block Design of Experiments RCBD Using Minitab 2020

Latin Square Design Using Minitab Updated 2020

Graeco Latin Square Design Updated 2020

Latin Square and Graeco Latin Square Design

Latin Square and Graeco Latin Square Design Analysis using Minitab

Screening the Important Factors/Variables

Factorial Design of Experiments

Introduction to Factorial Design and the Main Effect Calculation

Calculate Two-Factor Interaction Effect

Regression using the Calculated Effects

Basic Response Surface Methodology RSM Factorial Design

Construct ANOVA Table from the Effect Estimates

2k Factorial Design of Experiments

The Open Educator Textbook Explanation with Examples and Video Demonstrations for All Topics

Introduction to 2K Factorial Design

Contrast, Effect, Sum of Square, Estimate Formula, ANOVA table

Design Layout and Construction of 2K Factorial Design Using MS Excel

Write Treatment Combinations Systematically and Flawlessly

Contrast, Effect, Estimate, and Sum of Square Calculation Using MS Excel

Comparisons between MS Excel, Minitab, SPSS, and SAS in Design and Analysis of Experiments

Blocking and Confounding in 2k Design

Introduction to Blocking and Confounding

Confounding in Factorial and Fractional Factorial

Blocking and Confounding Using -1/+1 Coding System

Blocking and Confounding Using Linear Combination Method

Multiple Blocking and Confounding, How To

Complete vs Partial Confounding and The Appropriate Use of Them

How Many Confounded Treatments are There in a Multiple Confounded Effects

How to Confound Three or More Effects in Eight or More Blocks

Fractional Factorial Design

What is Fractional Factorial Design of Experiments

The One-Half Fraction Explained in 2K Fractional Factorial Design

Introduction to the Primary Basics of the Fractional Factorial Design

Design Resolution Explained

One-Half Fractional Factorial 2k Design Details Explained

How to Design a One-Half Fractional Factorial 2k Design using MS Excel

One-Quarter Fractional Factorial 2k Design

Design a One-Quarter Fractional Factorial 2k Design Using MS Excel

Calculate and Write All Effects in 2k Factorial Design Systematic Flawless

Write Alias Structure in 2K Fractional Factorial Design

Write Alias Structure in 2K Six Factor Quarter Fraction Factorial Design

Design a One-Eighth Fractional Factorial 2k Design Using MS Excel

2K Alias Structure Solution an Example Solution

Fractional Factorial Data Analysis Example Minitab ( Fractional Factorial DOE Data Analysis Example Document )

Design any Fractional Factorial Design with the Lowest Number of Possible Runs Easiest Method in MS Excel

The Easiest Way to Randomize an Experiment in using MS Excel

Plackett-Burman Fractional Factorial Design Using MS Excel

Plackett Burman Fractional Factorial Design of Experiments DOE Using Minitab

Optimize the Important Factors/Variables

Applied Regression Analysis

Simple Linear Regression Analysis Using MS Excel and Minitab

Simple Linear Regression Analysis Real Life Example 1

Simple Linear Regression Analysis Real Life Example 2

Simple Linear Regression Analysis Example Cost Estimation

Linear Regression Diagnostics Analysis

Response Surface Methodology

What is Response Surface Methodology RSM and How to Learn it?

Basic Response Surface Methodology RSM Design and Analysis Minitab

Response Surface Basic Central Composite Design

Response Surface Central Composite Design in MS Excel

Response Surface Design Layout Construction Minitab MS Excel

Response Surface Design Analysis Example Minitab

Multiple Response Optimization in Response Surface Methodology RSM

Box Behnken Response Surface Methodology RSM Design and Analysis Explained Example using Minitab

Is Box Behnken Better than the Central Composite Design in the Response Surface Methodology?

Advanced Complex Mixed Factors

Expected Mean Square, Basics to Complex Models

Expected Mean Square All Fixed Factors

Expected Mean Square Random Effect Model

Restricted vs Unrestricted Mixed Model Design of Experiments with Fixed and Random Factors

How to Systematically Develop Expected Mean Square Fixed and Random Mixed Effect Model

How to Systematically Develop Expected Mean Square Random, Nested, and Fixed Mixed Effect Model

Restricted vs Unrestricted Mixed Models, How to Choose the Appropriate Model

Nested, & Repeated Measure, Split-Plot Design

Nested Design

Repeated Measure Design

Split Plot Design

Difference between Nested, Split Plot and Repeated Measure Design

Minitab Analysis Nested, Split Plot, and Repeated Measure Design

Analysis & Results Explained for Advanced DOE Partly Nested, Split-Plot, Mixed Fixed Random Models

Approximate F test | Pseudo F Test for Advanced Mixed Models nested, split plot, repeated measure

Taguchi Robust Parameter Design

Files Used in the Video

Data Used in the Video for Robust Parameter Taguchi Design

How to Construct Taguchi Orthogonal Arrays Bose Design Generator

How to Construct Taguchi Orthogonal Arrays Plackett-Burman Design Generator

Taguchi Linear Graphs Possible Interactions

Taguchi Interaction Table Development How to

Video Demonstrations

Robust parameter Taguchi Design Terms Explained

Introduction To Robust Parameter Taguchi Design of Experiments Analysis Steps Explained

Robust Parameter Taguchi Design Signal to Noise Ratio Calculation in MS Excel

Robust Parameter Taguchi Design Example in MS Excel

Robust Parameter Taguchi Design Example in Minitab

How to Construct Taguchi Orthogonal Array L8(2^7) in MS Excel

How to Construct Taguchi Orthogonal Array L9(3^4) in MS Excel

How to Construct Taguchi Orthogonal Array L16(4^5) in MS Excel ( MS Excel file for the Design )

How to Construct Taguchi Orthogonal Array L16(2^15) in MS Excel

How to Construct Taguchi Orthogonal Array L32(2^31) in MS Excel

Construct Any (Taguchi) Orthogonal Arrays upto L36(2^35) in MS Excel

Taguchi Linear Graphs Explained and How to Use Them

Taguchi Triangular Interactions Table Explained and How to Use them in the Design of Experiments

Taguchi Interaction Table Construction Design of Experiments How to

Taguchi Linear Graphs, Interactions Table, Design Resolution, Alias Structure, & Fractional Factorial Design of Experiments

How to Create Robust Parameter Taguchi Design in Minitab

How to perform Robust Parameter Taguchi Static Analysis in Minitab

How to perform Robust Parameter Taguchi Dynamic Analysis in Minitab

How to perform Robust Parameter Taguchi Dynamic Analysis in MS Excel

Robust Parameter Taguchi Dynamic Analysis Regress Method in MS Excel and Minitab

Recommended Texts

General Design of Experiments

[The order is based on the Use of the Book]

Hinkelmann, K., & Kempthorne, O. (2007). Design and Analysis of Experiments, Introduction to Experimental Design (Volume 1) . John Wiley & Sons. ISBN-13: 978-0471727569; ISBN-10: 0471727563.

Hinkelmann, K., & Kempthorne, O. (2005). Design and Analysis of Experiments, Advanced Experimental Design (Volume 2) . John Wiley & Sons. ISBN-13: 978-0471551775; ISBN-10: 0471551775.

Montgomery, D. C. (2012). Design and Analysis of Experiments, 8th ed. John Wiley & Sons. ISBN-13: 978-1118146927; ISBN-10: 1118146921.

Box, G. E., J. S. Hunter, et al. (2005). Statistics for experimenters: design, discovery and innovation, Wiley-Interscience.

Kempthorne, O. (1952). The design and analysis of experiments, John Wiley & Sons Inc.

Fisher, R. A., Bennett, J. H., Fisher, R. A., & Bennett, J. H. (1990). Statistical methods, experimental design, and scientific inference. Oxford University Press. ISBN-10: 0198522290; ISBN-13: 978-0198522294.

Regression & Response Surface

Kutner, M. H., Nachtsheim, C. J., Neter, J., & Li, W. (2013). Applied linear statistical models .

Myers, R. H., Montgomery, D. C., & Anderson-Cook, C. M. (2019). Response surface methodology: Process and product optimization using designed experiments . Hoboken: Wiley.

Robust Parameter Optimization

Taguchi Design of Experiments

Kacker, R. N., Lagergren, E. S., & Filliben, J. J. (1991). Taguchi’s orthogonal arrays are classical designs of experiments. Journal of research of the National Institute of Standards and Technology , 96 (5), 577.

Plackett, R. L., & Burman, J. P. (1946). The design of optimum multifactorial experiments. Biometrika , 305-325. (for Video #11)

Taguchi, G., Chowdhury, S., Wu, Y., Taguchi, S., & Yano, H. (2011). Taguchi's quality engineering handbook. Hoboken, N.J: John Wiley & Sons.

Chowdhury, S., & Taguchi, S. (2016). Robust Optimization: World's Best Practices for Developing Winning Vehicles. John Wiley & Sons.

Random-Effect Models, Mixed Models, Nested, Split-Plot & Repeated Measure Design of Experiments

Quinn, G. P., & Keough, M. J. (2014). Experimental design and data analysis for biologists . Cambridge: Cambridge Univ. Press.


  • Open access
  • Published: 27 August 2024

Enhancing students’ attitudes towards statistics through innovative technology-enhanced, collaborative, and data-driven project-based learning

  • Andreea Cujba 1 &
  • Manoli Pifarré   ORCID: orcid.org/0000-0002-4271-4824 1  

Humanities and Social Sciences Communications, volume 11, Article number: 1094 (2024)


Given the substantial body of educational research highlighting the significant influence of student attitudes on academic performance, particularly in disciplines like statistics where anxiety is prevalent, there is a need to investigate how innovative methodologies could reshape these attitudes. This paper capitalizes on the advancements from previously uncombined innovative methodologies of teaching statistics, such as project-based learning, data analytics, collaborative work, or the use of technology. Specifically, this paper reports on the design, implementation, and evaluation of innovative technology-enhanced, collaborative, and data-driven project-based learning, aiming to positively impact students’ attitudes towards statistics as a cornerstone to improve statistical knowledge. To achieve this, a quasi-experimental research study involving 174 secondary students was undertaken, with participants divided into an experimental group (EG) and a control group (CG). Results indicate a notable positive shift in attitudes among EG students following the intervention. The EG students decreased their anxiety after the intervention and increased their affect and positive attitude toward using technology for learning statistics. By contrast, the CG students did not show any positive effect on their attitudes. These findings underscore the potential of the innovative instructional design implemented in this project to not only foster practical statistical problem-solving skills but also cultivate positive attitudes crucial for statistical competence. Educational implications are discussed.


Introduction

In our increasingly digital world, technology generates vast quantities of data, and with the rise of artificial intelligence, this influx is set to skyrocket. To effectively harness and make sense of this data, citizens need to be equipped with robust data analytic skills. As evidence-based decision-making becomes increasingly imperative, advanced data analytic abilities will be indispensable. However, a significant challenge lies in the fact that many secondary students lack the positive attitudes necessary to engage with and learn data analytics skills and statistics (Garfield and Ben‐Zvi, 2007 ; Szczygieł and Pieronkiewicz, 2021 ).

Previous educational research has confirmed the role of developing positive attitudes in obtaining better results and meaningful learning of mathematics and statistics (e.g., Albelbisi and Yusop, 2018; Dowker et al., 2019; Muñoz et al., 2018; Silva and Sousa, 2020).

In the same vein, Emmioğlu and Capa-Aydin ( 2012 ) point out that positive attitudes towards statistics correlate positively with higher students’ results in statistics courses. Furthermore, several studies indicate that most students and adults do not statistically reason about important issues that affect their lives because they have not acquired the necessary skills (Domu et al., 2023 ; Garfield and Ben‐Zvi, 2007 ; Haddar et al., 2023 ; Özmen and Baki, 2021 ). Therefore, many students do not understand the usefulness or application of statistics in real and daily life and develop negative attitudes, e.g., anxiety, towards statistical content (Gal and Ginsburg, 1994 ; Williams, 2015 ). Rejection towards this subject is also accounted for by the widespread and well-known mathematical anxiety (Szczygieł and Pieronkiewicz, 2021 ) due to the student’s perception that statistics posits a great deal of mathematical content, without a real application and is difficult to understand (Gal and Ginsburg, 1994 ).

This paper capitalizes on the advancements from previously uncombined innovative methodologies of teaching statistics, such as project-based learning, data analytics, collaborative work, or the use of technology; and it proposes an innovative instructional design to promote positive students’ attitudes towards statistics. Moreover, the paper reports on the implementation and evaluation of technology-enhanced, collaborative, and data-driven project-based learning and its impact on students’ attitudes towards statistics via a quasi-experimental study. The paper contributes an innovative pedagogy that combines and integrates the advancements of already tested teaching methods for engaging students in big data analysis, increasing their positive attitudes towards statistics, a cornerstone to improve students’ statistics skills and learning.

Literature review

Attitudes have been broadly defined as not directly observable, inferred aspects consisting of beliefs, feelings, and behavioural predispositions towards the object to which they are directed (Nolan et al., 2012 ). Although the attitude definition is not consistent in the literature, in accordance with the most frequent definitions in research, an attitude is a psychological tendency that is expressed by evaluating a particular entity with some degree of favour or disfavour (Savelsbergh et al., 2016 ). Hence, this psychological tendency is shaped through experience and determines future behaviours. In this line of argument, an attitude can be seen as a personal characteristic that has an influence on subject’s behaviour (Di Martino and Zan, 2015 ).

In the context of learning, attitudes towards mathematics, and statistics in particular, are profound feelings and emotional reactions shaped by students’ experience in solving statistics tasks and throughout time (Tuohilampi, 2016 ). In other words, attitudes toward statistics can be seen as students’ expectations towards this subject, and according to them, the student will have one reaction or another in statistics class (Batanero and Díaz, 2011 ). Math anxiety is the sensation of concern and worry felt when thinking about mathematics or while doing a mathematics task (Abín et al., 2020 ).

Educational research claims that positive attitudes towards mathematics and statistics can be promoted by implementing innovative teaching methods that include, among others, the following five educational variables: (a) student-centred learning; (b) project-based learning and solving real problems or challenges familiar to students; (c) data analytics (henceforth DA) skills; (d) collaborative learning; and (e) use of interactive technologies (Chew and Dillon, 2014; Savelsbergh et al., 2016).

In this line of argument, recently, the growth in the everyday use of digital technologies is creating vast reservoirs of data. These data have huge but largely untapped potential. The economic sector has already considered the necessity to understand the “big data” generated in each sector and turn it into insight and action. Therefore, there is an increasing demand for citizens with the skills and creativity capable to perform data-driven decision making (Frischemeier et al., 2022 ). For example, the Guidelines for Assessment and Instruction in Statistics Education (GAISE) Report (Bargagliotti et al., 2020 ) for the pre-K-12 classroom explicitly emphasize the need for innovative instructional programmes about data analytics to teach students to: formulate questions that can be answered using data, learn to collect data, organize data, create graphs and charts with data to answer their questions. In this context, there is a need for studies that innovate and extend best practices in teaching statistics in schools using data analysis and technology-enhancement through a project-based learning approach (Chew and Dillon, 2014 ; Koparan and Güven, 2014 ). Countless investigations point to the positive impact of technology on students’ attitudes, and so technology-driven teaching becomes a useful pedagogical tool for teaching and learning statistics (Emmioğlu and Capa-Aydin, 2012 ; Ramirez et al., 2012 ).

In this line of research, this paper aims to design, implement, and evaluate a technology-enhanced, project-based intervention that could offer secondary students the statistical and digital skills needed to use data to address real-life problems. Specifically, in this paper, we analyse the effects that this technology-enhanced project-based intervention could have on students’ attitudes toward statistics. Our working hypothesis is that students will improve their positive attitudes towards statistics because the technology-enhanced project-based intervention will create a meaningful and positive learning environment that will raise the student's awareness of the role of data, statistics, and technology in many everyday problems.

In the next sections, we review previous research on the effects of the four uncombined, innovative educational variables in statistics education, namely: (i) project-based learning, (ii) data analytics approach, (iii) use of technology, and (iv) collaborative work. This will be followed by our research study, the results and discussion of our findings and, finally, the educational implications for statistics education.

Project-based learning and data analytics approach

The use of project-based Learning (henceforth PBL) has been increasingly practised globally in schools. This methodology is characterized by the introduction of the following four educational variables: student-centred learning, problem-solving structured in different research phases, contextualized learning contents in real and open-ended challenges and collaborative work (Haatainen and Aksela, 2021 ). In this line, Batanero and Díaz ( 2011 ) claim the importance of contextualizing the data used in real-life problems when designing PBL in statistics. This aspect encourages, firstly, the student's interest and motivation, even more so if they can choose to tackle the problems they are interested in; secondly, students value the relevance of statistics since it can solve real-life problems and facilitate scientific and economic development. Overall, they adhere to the theory that PBL can improve the students’ attitudes toward statistics. In this same line of argument, Santos ( 2016 ) adds to the equation the influential role of digital technologies in solving collaboratively real-life problems and increasing the positive attitudes towards learning statistics to solve a problem in small groups.

Different quasi-experimental studies have reported the benefits of this innovative methodology on students learning and on students’ attitudes and affect towards statistics (Bateiha et al., 2020 ; Chong et al., 2019 ; Özdemir et al., 2015 ; Markulin et al., 2021 ). In these studies, it is reported that PBL methodology promotes the creation of a creative environment, as most students perceived the project to be an easy and enjoyable activity that favours the learning of mathematical concepts as well as the development of key soft skills such as sense of responsibility, communication skills and ability to work in small groups (Özdemir et al., 2015 ). Besides, PBL encourages students to take a more active role by allowing them to take responsibility for and decisions on their own learning process while the teacher guides them through their learning processes, by taking into account their interests (Moreno-Guerrero et al., 2020 ). These PBL characteristics could have a positive impact on students’ attitudes towards statistics (Özdemir et al., 2015 ), and on students’ affect towards learning statistics (Chong et al., 2019 ).

Recently, along with the appearance of interactive technologies, new ways of engaging with real-life data—notably via interactive data visualizations—have emerged and new ways of thinking and learning from complex data have evolved (Engel, 2017 ; Sutherland and Ridgway, 2017 , Rao et al., 2023 ). In this context, various authors have seen the need to develop studies that introduce the perspective of data analytics when designing PBL in teaching statistics (Kazak et al., 2021 ; Zotou et al., 2020 ). From this perspective, data analytics is seen as a process of engaging students creatively in exploring data to understand our world better, draw conclusions, make decisions and predictions, and critically evaluate present/future courses of action (Fujita et al., 2018 ). Data analytics does not focus on learning mathematical procedures but on understanding and interpreting data to solve a real-life problem (Chew and Dillon, 2014 ). Furthermore, data analytics reinforces the active role of students in learning statistics as they must make the effort to focus on the process of understanding and interpreting data to address a real-life problem. The students are encouraged to solve the problem since the teacher acts only as a guide and will not provide them with a solution.

Interactive technologies have been essential in teaching and learning statistics and data analytics. Technologies can provide a creative and interactive environment to represent, visualize and manipulate data in a way that encourages students to think and learn from complex data. In this respect, our educative intervention has designed a technology-enhanced, project-based learning environment that promotes the use of a variety of technological tools for learning key statistical concepts and developing key skills, e.g., explore, understand and interpret data to solve a real problem. In the next section, we will present key studies that have used technology affordances to promote better statistical literacy and positive attitudes toward statistics.

Use of technology to increase the students’ attitudes toward statistics

In the use of technology for teaching mathematics, there is a trend towards constructivist tasks based on research, which supports collaborative approaches, resolution of problems, and the practice of learning by doing. Bray and Tangney ( 2017 ) point this out through a systematic analysis of 139 studies and, in view of the results, conclude that contemporary technologies increase collaboration and allow a practical application of mathematics through visualization, modelling and manipulation. They claim that technologies provide an interactive, dynamic, and contextualized learning of the subject. These technological affordances facilitate experimentation and testing of ideas and manage to change classroom dynamics from the teacher leading the session and transmitting knowledge to more dynamic student-centred research.

Technological tools are also increasingly used in teaching statistics as the means to mediate and promote learning of problem-solving strategies and statistical challenges. Among the affordances of technologies to promote statistical education, Ridgway et al. ( 2017 ) highlight data visualizations as they facilitate interaction with data in a more intuitive, dynamic, and exploratory way. Such software programmes as TinkerPlots (dynamic data exploration, available at https://www.tinkerplots.com/ ) or common online data analysis platform (CODAP, available on http://codap.concord.org ) are widely used to promote statistical literacy and positive attitudes toward statistics. Among the main characteristics of these software programmes, the more salient are the next four: (a) they facilitate modelling activities, in which students can deeply analyse real-world situations through mathematical representations and asking questions, (b) they mediate between conceptual thinking and investigate probability events and identify patterns, (c) they improve intuition about data representation and analysis, and (d) they facilitate the creation of graphs (Gonzalez and Trelles, 2019 ; Kazak et al., 2014 ).

Various authors provide evidence of how the characteristics of technologies such as TinkerPlots, CODAP, and Fathom improve the students’ learning and attitudes. Gonzalez and Trelles ( 2019 ) investigated how a group of 15-year-old students increased their motivation through modelling activities in mathematics through TinkerPlots. In this study, modelling is defined as a learning system that encourages students to ask questions and analyse situations that could be real through mathematics. Other authors agree that the use of technological tools, such as CODAP is essential to develop students’ statistical reasoning (Casey et al., 2020 ; Mojica et al., 2019 ). The ability of CODAP to facilitate working with large data sets makes it easier for students to focus on making decisions about data analysis and reasoning about different forms of data representation, rather than on struggling with computational work, since no programming knowledge is required (Casey et al., 2020 ; Frischemeier et al., 2021 ). In this line, Kazak et al. ( 2014 ) showed how 11-year-olds improved their understanding of statistics with the help of TinkerPlots through collaborative work in small groups. The authors used TinkerPlots as a technology that mediated conceptual thinking to investigate various probability events in statistics and identify patterns. They argued that this software favoured the improvement of the students’ intuition about data representation and analysis and facilitated the creation of graphs.

Many other studies amplify the potential of technology in favouring positive attitudes and learning of mathematics by integrating technology in the classroom along with other teaching and learning strategies that have also proved relevant for improving mathematics learning. Attard and Holmes ( 2020 ) show that new technologies manage to place the student at the centre of the teaching–learning process: technology captures the attention and interest of students by means of immediate instructions and feedback. In addition, technology offers students an additional and different space for communication, beyond the classroom (Attard and Holmes, 2020 ).

The technology-enhanced, project-based study presented in this paper explicitly implements the findings of recent educational research based on supporting classroom dialogue, thinking and collaborative learning. In the next section, we will present these key findings.

Collaborative work

Collaborative work has been embedded in PBL (Fredriksen, 2021 ; Lyons et al., 2021 ; Ozdamli et al., 2013 ; Özdemir et al., 2015 ) and its impact on students’ development of positive attitudes towards mathematical learning is highly reported (Kazak et al., 2014 ; Moreno-Guerrero et al., 2020 ; Özdemir et al., 2015 ). Furthermore, educational research claims that interactive technologies can afford group work and communication and enrich the development of key problem-solving strategies (Kazak et al., 2014 ; Major et al., 2018 ; Noll et al., 2018 ).

Promotion of collaborative learning involves working explicitly on ground rules, interactional processes, and exploratory talk (Mercer, 2019 ). Exploratory talk improves attitudes toward learning as it facilitates the exploration and understanding of content and promotes intersubjectivity between group members when creating jointly new knowledge and understandings (Gómez, 2016 ; Knight and Mercer, 2015 ; Mercer et al., 2019 ). Dialogue is also very important for better organization and management of the group. This aspect is verified by Kazak et al. ( 2014 ) through an intervention based on collaborative work with technology. In this experiment, students were instructed to communicate with their classmates in a dialogical way, following five ground rules: (1) ensuring that all members of the group contribute with ideas; (2) asking classmates for arguments, listening to explanations and making an effort to understand; (3) being interested in what the others think; (4) taking into account different points of view or alternative methods, and (5) trying to reach a consensus before carrying out an action with the computer. This study, whose main objective was to teach key concepts of statistics and probability to 11-year-old students, through qualitative analysis of the dialogues from the groups, concluded that the students improved their communication with and opinions about their classmates. It also proved that their contributions were incorporated and integrated, thus facilitating the consensus of ideas.

Our study aims to contribute to research on the design and application of innovative methods in teaching statistics. To this end, our research took a quasi-experimental approach toward answering the following research question: what are the effects of a collaborative, technology-enhanced and data-driven project-based intervention on students’ attitudes towards statistics? Our general working hypothesis was that the design and implementation of a long-term real-classroom intervention that embeds and combines the three key educative variables for the promotion of statistics education, i.e., collaborative learning, technology-enhanced learning, and project-based learning, would have a positive impact on the students’ attitudes towards statistics. Furthermore, our expectations were that those students who received a collaborative, technology-enhanced project-based intervention would improve their attitudes towards statistics unlike their counterparts who followed a regular standard curriculum.

Our research aims to confirm or reject the next four hypotheses:

H1. Students following the collaborative, technology-enhanced, data-driven project-based intervention (henceforth SPIDAS) will improve their global attitude towards statistics. This increment will be higher than their counterparts who follow a traditional intervention.

H2. Students following the SPIDAS intervention will decrease their anxiety towards statistics, unlike their counterparts who follow a traditional intervention.

H3. Students following the SPIDAS intervention will increase their affect towards statistics more than their counterparts who follow a traditional intervention.

H4. Students following the SPIDAS intervention will improve their attitude towards learning statistics with technology more than their counterparts who follow a traditional intervention.

This research is part of a larger EU ERASMUS+ project called International Strategic Partnership for Innovative in Data Analytics in Schools (henceforth SPIDAS), which aims to innovate and extend best practices in data analytics in schools. In this paper, we report on only one aspect of the ERASMUS+ project: to analyse the impact of the SPIDAS educational intervention on students’ attitudes towards learning statistics, a quasi-experimental design was planned in which an experimental group (henceforth EG) followed the SPIDAS instruction and a control group (henceforth CG) followed the traditional teaching method.

Readers can learn more about the design, implementation, and multi-method evaluation of the innovative statistics instruction carried out in this Erasmus+ project in Cujba and Pifarré (2023, 2024) and on the https://spidasproject.org.uk/ website.

Participants

A total of 174 8th-grade students (13–14 years old) from two Spanish publicly funded private schools participated, either as part of the experimental group (EG) or the control group (CG). The EG comprised 110 students with a homogeneous gender distribution: 52.7% (58) were girls and 47.3% (52) were boys. The CG comprised 64 students, also with a homogeneous gender distribution: 53.12% (34) were girls and 46.88% (30) were boys. Additionally, both schools had similar medium socioeconomic characteristics, and the sample demonstrated a comparable level of general academic achievement, as evidenced by the results of the National Test of Basic Skills. Several studies have shown a significant correlation between socioeconomic status and academic achievement, with lower achievement in schools with lower socioeconomic status backgrounds (Berkowitz et al., 2017).

Additionally, the study assessed participants’ prior statistical knowledge, uncovering a notable deficiency in this area (Cujba and Pifarré, 2023 ). For further insights into the beneficial effects of the innovative instructional design detailed in this paper on enhancing students’ statistical knowledge, readers are encouraged to explore the Cujba and Pifarré ( 2023 ) findings.

Materials and procedure

Following previous research in the area (e.g., Nolan et al., 2012), this study evaluates the students’ attitudes towards statistics with technology using a questionnaire developed and exploratorily validated in Spanish (Cujba and Pifarré, 2024). In synthesis, the validation process consisted of three steps. Firstly, the questionnaire development was based on a thorough revision of previous international questionnaires. Secondly, a double back-translation of the original items was carried out, followed by a consensus process among expert judges (content validity). Thirdly, the questionnaire was applied and tested with a sample of 254 13/14-year-old Spanish Secondary Education students. As a result of this process, a three-factor structure (namely anxiety, learning statistics with technology, and affect) was found through exploratory factor analysis using varimax rotation with the SPSS programme. Evidence of internal consistency was provided with an overall α = 0.83 (“anxiety” factor α = 0.83; “learning statistics with technology” factor α = 0.76; “affect” factor α = 0.77). The results showed suitable psychometric properties for using the questionnaire to evaluate secondary education students’ attitudes towards statistics with technology in the Spanish language.
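
For readers who want to reproduce this type of validation analysis outside SPSS, the sketch below illustrates an exploratory factor analysis with varimax rotation and a Cronbach's alpha computation in Python. The simulated response matrix, the item names, and the use of scikit-learn are our own illustrative assumptions; the sketch does not reproduce the authors' actual SPSS analysis or data.

```python
# Illustrative sketch only: EFA with varimax rotation plus Cronbach's alpha.
# The 254 x 16 response matrix is randomly generated, so the loadings and
# alpha values are meaningless; substitute real questionnaire data.
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
items = pd.DataFrame(rng.integers(1, 5, size=(254, 16)),      # Likert 1-4 answers
                     columns=[f"i{k}" for k in range(1, 17)])

# Exploratory factor analysis with three factors and varimax rotation
# (scikit-learn >= 0.24 supports rotation="varimax").
efa = FactorAnalysis(n_components=3, rotation="varimax").fit(items)
loadings = pd.DataFrame(efa.components_.T, index=items.columns,
                        columns=["F1", "F2", "F3"])
print(loadings.round(2))

def cronbach_alpha(df: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = df.shape[1]
    item_var = df.var(axis=0, ddof=1).sum()
    total_var = df.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

print(round(cronbach_alpha(items), 2))                                     # full 16-item scale
print(round(cronbach_alpha(items[["i5", "i7", "i10", "i11", "i12"]]), 2))  # anxiety subscale
```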

The final version of the questionnaire contains 16 items structured along three factors: anxiety, learning statistics with technology, and affect. The questionnaire was applied to the 174 students who participated in this study at two different moments: before and after the educational intervention. The students were tested individually, and their attitudes were evaluated using a Likert scale of 4 options: 1 = Totally disagree, 2 = Disagree, 3 = Agree, and 4 = Totally agree. Scores on all negatively worded items (i.e., anxiety factor items) were reversed prior to data analysis.

Experimental group (EG) intervention: the collaborative, technology-enhanced data-driven project-based intervention—SPIDAS

The EG educational intervention lasted 30 h, distributed over 2 months. Students completed a real-life statistical project on how the weather influences daily activities, a topic of great current interest that requires well-argued, data-based answers. Students worked in small groups of 3–4, combining on-site classroom activities with work outside the classroom. For the outside work, the groups collaborated synchronously on shared documents using the Google Drive platform.

The SPIDAS project innovatively incorporates and combines the three key pedagogical axes that the literature review identifies as relevant in promoting data analysis skills in students, namely: (1) data-driven project-based learning, (2) collaborative learning, and (3) the use of technology to help learn statistics through visual learning (Frischemeier et al., 2021). In our project, we used the open-source software CODAP. This software has the same attributes as TinkerPlots: both are easy to use by inexperienced users, allow flexible plot creation, deal with data as a first-order persistent object, support exploratory and confirmatory analysis, and are very interactive (McNamara, 2018). As a disadvantage, TinkerPlots requires prior installation and a paid license per computer.

Next, we describe how the three pedagogical axes were incorporated into the SPIDAS educational intervention. Fig. 1 graphically illustrates how the SPIDAS intervention leverages and combines the advancements of these three innovative methods for teaching statistics. Readers can learn more about the SPIDAS educational intervention in: https://spidasproject.org.uk .

Fig. 1 SPIDAS intervention.

Data-driven project-based learning: The SPIDAS intervention explicitly incorporates the four educational variables highlighted in the PBL literature review (e.g., Batanero and Díaz, 2011; Bateiha et al., 2020; Haatainen and Aksela, 2021): (a) structuring of the students’ learning process in enquiry phases; (b) statistical literacy contextualized in real, daily life; (c) an active role for the students and a guiding role for the teacher; and (d) development of a ‘data analytics (DA) cycle’ drawing on the PPDAC statistical enquiry cycle (Wild and Pfannkuch, 1999), the statistical thinking process (Wild et al., 2011) and informal statistical inference (Makar and Rubin, 2018). In synthesis, the main tasks of the data-driven project are presented in Table 1. To give more information about the instructional design, Table 1 and Figs. 2–5 present examples of the activities of one group of students.

Fig. 2 Example of Define the problem activity.

Fig. 3 Example of students’ graph for Explore data.

Fig. 4 Example of Draw conclusions activity.

Fig. 5 Example of Make decisions activity: the students’ infographic used to communicate their project decisions and conclusions.

Collaborative learning: Students worked in small groups during the whole project and were encouraged to actively create, reflect on and evaluate ideas by using effective communication skills and ground rules. Three specific strategies from the “Thinking Together” programme (Mercer et al., 2019) for promoting good small-group work and exploratory talk were explicitly taught: (a) reflection on group roles, (b) reflection on attitudes and behaviours that promote collaborative learning, and (c) development of effective ground rules.

Technology: The SPIDAS project used two types of technologies: the CODAP data analysis software and different applications linked to Google Drive. The CODAP software allows graphical visualization of data and supports students’ understanding and interpretation of their data. Due to CODAP’s interactive and manipulative design, students can actively explore their own data and obtain meaningful graphical representations that help them draw data-based conclusions (e.g., Fig. 3). Considering the extensive research on the role of interactive technologies in enhancing collaborative work and dialogic discussions (Major et al., 2017; Pifarré, 2019), some Google Drive applications (such as Docs and Slides) were used. These applications allow the creation of synchronous, multi-user workspaces that, in our project, fostered four key processes of collaborative work with technology: (a) discussion of shared ideas; (b) co-construction of new ideas, planning and reflection on joint work; (c) support for the development of statistical literacy; and (d) enrichment of data analytics strategies such as organization of data, manipulation of data, creation of graphs, analysis of data and making decisions based on data.

The innovative instructional design advocated for student-centred methodologies, wherein teachers adopted a dual role: part lecturer, part coach, fostering collaborative learning within student groups. They facilitated the data analytics projects undertaken by each group, providing guidance throughout the process and supporting them in drawing conclusions and making informed decisions based on their analyses.

Control group (CG) intervention

The CG followed a traditional intervention, which also lasted 2 months. It was a teacher-centred intervention, and the teacher mainly used lectures to teach the same statistical concepts taught in the EG. The statistical content covered the following concepts: mean, median, mode, range, variability, qualitative and quantitative variables, frequency, proportional reasoning, count, reading graphs, sample, and population. CG students followed the lectures, had a passive role, paid attention to the teachers’ explanations and applied what they had learned by carrying out a series of individual, routine exercises. Unlike in the EG, in the CG all students worked with the same data provided by the teacher. Some of these exercises were carried out outside the classroom as homework and were solved individually.

Regarding the use of technology, Excel software was used. This tool is a spreadsheet that allows calculations and graph creation. Working with it requires knowing which mathematical operations are needed to compute the desired parameters. We believe that, with Excel, students must invest more time in understanding which calculations and formulas to apply, and how, rather than in analysing and interpreting the data. The teachers explained in class the operations to be carried out with Excel so that, at home, the students could complete most of the activities. These activities contained real-life data, yet lacked contextualization in a problem or daily-life situation.

Data analysis

In order to analyse the sample normality, a Shapiro–Wilk test was run with SPSS. Because the sample was not normally distributed, non-parametric tests were used. On the one hand, the Wilcoxon test was used to compare the intragroup differences between the pre-test and post-test results. On the other hand, the Mann–Whitney U test was used to analyse the intergroup differences between the post-test results of the two groups (experimental vs. control).
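
As an illustration of this pipeline, the following minimal sketch runs the same three tests with SciPy on simulated score vectors; the data, sample sizes and score range are hypothetical, and the authors' actual analyses were carried out in SPSS on their own data.

```python
# Illustrative sketch: normality check followed by the two non-parametric tests
# described above, run on simulated global attitude scores (16 items on a 1-4
# scale, so totals range from 16 to 64).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
pre = rng.integers(16, 65, size=110).astype(float)            # EG pre-test scores (simulated)
post = np.clip(pre + rng.normal(3, 6, size=110), 16, 64)      # EG post-test scores (simulated)
cg_post = rng.integers(16, 65, size=64).astype(float)         # CG post-test scores (simulated)

# 1. Normality check (Shapiro-Wilk); a small p-value justifies non-parametric tests.
print(stats.shapiro(post))

# 2. Intragroup pre/post comparison: Wilcoxon signed-rank test (paired samples).
print(stats.wilcoxon(pre, post))

# 3. Intergroup post-test comparison: Mann-Whitney U test (independent samples).
print(stats.mannwhitneyu(post, cg_post, alternative="two-sided"))
```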

This section will analyse the effects of the technology-enhanced, collaborative and data-driven project-based learning on four variables (or factors) included in the questionnaire on the students’ attitudes towards statistics, namely: (a) global attitude towards statistics learning; (b) anxiety towards statistics; (c) affection towards statistics; and (d) attitude towards statistics with technology. The global attitude score resulted from the sum of the 16 items included in the questionnaire ( Annex ). Similarly, the score for each factor was calculated by adding the ratings of all the items that composed each factor. Therefore, the factors of anxiety (i5, i7, i10, i11, i12) and affection (i1, i3, i8, i14, i15) contained five items each and the statistics learning with the technology factor contained six items (i2, i4, i6, i9, i13, i16). The Likert scale was a four-point scale: 1 (strongly disagree), 2 (disagree), 3 (agree), or 4 (strongly agree).
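
For clarity, the scoring rule just described can be expressed in a few lines of code. The sketch below reverse-codes the anxiety items on the 1–4 scale and sums the items per factor and overall; the item-to-factor assignment follows the list above, while the response matrix itself is simulated and purely illustrative.

```python
# Illustrative scoring sketch: reverse-code negatively worded anxiety items
# (x -> 5 - x on a 1-4 Likert scale), then sum items per factor and overall.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
responses = pd.DataFrame(rng.integers(1, 5, size=(174, 16)),   # simulated answers
                         columns=[f"i{k}" for k in range(1, 17)])

ANXIETY = ["i5", "i7", "i10", "i11", "i12"]
AFFECT = ["i1", "i3", "i8", "i14", "i15"]
TECHNOLOGY = ["i2", "i4", "i6", "i9", "i13", "i16"]

scored = responses.copy()
scored[ANXIETY] = 5 - scored[ANXIETY]        # reverse-code negatively worded items

scores = pd.DataFrame({
    "anxiety": scored[ANXIETY].sum(axis=1),
    "affect": scored[AFFECT].sum(axis=1),
    "technology": scored[TECHNOLOGY].sum(axis=1),
})
scores["global"] = scores.sum(axis=1)        # sum of all 16 (recoded) items
print(scores.describe().round(1))
```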

Intervention effect on intragroup differences

Wilcoxon analyses were carried out to study the effect of each educational intervention on the corresponding group of students’ (EG and CG) attitudes toward statistics. The effect size was checked with Cohen’s d statistic. Table 2 summarizes these results. Experimental group students showed significant differences (α = 0.05) in their global attitude score. In addition, experimental group students displayed significant score differences in the anxiety and technology factors. Although there was a positive trend in the affection factor, no statistical significance was found.

The control group students did not show significant differences either in the global attitude score or in any of the three factors analysed.
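
For reference, the effect-size statistic used in these analyses (Cohen's d) can be computed from two score vectors using the pooled standard deviation, as in the hedged sketch below; the numbers are simulated and do not correspond to the values reported in Table 2.

```python
# Illustrative sketch of Cohen's d with a pooled standard deviation.
import numpy as np

def cohens_d(a: np.ndarray, b: np.ndarray) -> float:
    """Cohen's d = (mean(a) - mean(b)) / pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1))
                     / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled

rng = np.random.default_rng(3)
pre, post = rng.normal(44, 6, 110), rng.normal(47, 6, 110)     # simulated scores
print(round(cohens_d(post, pre), 2))
```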

Intervention effect on intergroup differences

A Mann–Whitney U test was carried out to investigate the differences between the two groups before and after the students’ participation in the SPIDAS intervention. The effect size was checked with Cohen’s d statistic. Table 3 summarizes these results. These analyses display significant differences (α = 0.05) between EG and CG students in their global attitude score and in the three factors of the questionnaire, namely anxiety, affection, and use of technology. Figure 6 displays the mean scores obtained by the two groups in the pre- and post-measurements of the different factors of the questionnaire and the statistically significant differences observed in the analysis.

Fig. 6 Pretest and posttest results and statistical differences between EG and CG students.

Furthermore, these results firstly show that the experimental group obtained higher mean post-test scores than the control group on the overall attitude questionnaire towards statistics (see Fig. 6). The experimental group presented a higher global attitude score in the questionnaire both before and after the intervention. Although the global attitude of the EG students was already higher than that of the CG before the intervention, the post-intervention improvement was greater and statistically significant in the EG but not in the CG, whose improvement was barely perceptible. This result allows us to conclude that the SPIDAS intervention had a positive impact on the students’ attitudes toward learning statistics with technology.

Secondly, EG students significantly decreased their levels of anxiety toward learning statistics after their participation in the SPIDAS intervention, unlike the CG students, who showed only a slight, statistically non-significant decrease in this variable. The difference in the post-measure between the two groups was statistically significant, with the EG score being higher than that of the CG students (Fig. 6).

Thirdly, regarding the affection factor, EG students showed a tendency to improve after the SPIDAS intervention: the post-test value was higher than the pre-test value in this group. In the CG, on the other hand, the post-test score in the affection factor decreased. Therefore, the traditional intervention had a negative impact on the students’ perception of their abilities to learn and solve statistical problems. The comparison between the two groups (see Fig. 6) yielded statistically significant differences between the EG and the CG in the pre- and post-measures. In addition, the EG showed a higher score in the post-measure while the CG’s post-score decreased, which widened the differences between the two groups in this variable. These data allow us to conclude that the SPIDAS intervention also had a positive impact on the affection variable towards learning statistics.

Finally, with respect to technology, EG students significantly improved their attitude toward learning statistics with technology after the SPIDAS intervention, unlike the CG students, who hardly showed any improvement. The comparison of post-intervention scores for the technology factor (see Fig. 6) shows statistically significant differences between the groups: EG students obtained higher scores in this factor than CG students. Thus, the technology-enhanced intervention positively influenced the students’ attitude towards learning statistics.

Discussion and conclusions

The main objective of this study was to investigate the effects of technology-enhanced, collaborative, and data-driven project-based learning on the students’ attitudes towards statistics. This study distinguishes itself from other PBL studies on statistics because we investigated a long-term intervention in real classrooms and integrated into one intervention the three pedagogical variables that previous research highlighted as relevant in statistics education: (a) the use of technological tools and affordances for analysing and visualizing data, (b) enrichment of collaborative strategies and (c) project-based learning with a data analysis approach.

Results show that the designed SPIDAS intervention had a highly positive impact on the students’ attitudes toward statistics (Hypothesis 1) and on students’ affect towards learning statistics (Hypothesis 3). Our results support those obtained by other authors indicating that PBL is an innovative methodology that meets the necessary characteristics to improve students’ attitudes and increase their interest and motivation towards statistics, thanks to the active role of students in investigating real-life questions (Koparan and Güven, 2014; Siswono et al., 2018). The EG intervention granted students an active role in investigating a problem that captured their interest and was related to daily life, with the added use of technology. According to Siswono et al. (2018), improvement in learning statistics is greater when PBL is combined with technology.

Furthermore, our SPIDAS intervention explicitly taught students a series of collaborative strategies, such as the assumption of different roles, drawing up a joint work plan, managing time for solving problems, distributing responsibilities and co-evaluating group work. Previous educational research highlights that improving the organization and management strategies of group work has a positive impact on the functioning of small groups and on the creation of positive synergy between group members, which in turn has a positive impact on all students’ attitudes towards learning (Chang and Brickman, 2018; Pai et al., 2015). In this respect, our study confirms these results.

Unlike the results obtained by the EG students, the traditional intervention followed by the CG did not improve the students’ attitudes towards statistics. Previous research points out that one of the limitations of traditional teaching is that it does not contextualize statistical concepts in real-life situations and thus students cannot establish meaningful links with daily life problems. This has a negative impact on the students’ attitudes towards learning statistics (Bateiha et al., 2020; Hwa, 2018; Özdemir et al., 2015). Our CG students learned statistical concepts focusing on mathematical procedures and calculations with sets of data that bore no relation to their real-life context. This makes it difficult for students to build meaningful learning and to feel that statistical concepts could be useful to them outside the school context and for solving daily problems (Lalayants, 2012). Different studies highlight that the practical use of curricular content, in collaboration with peers and with the teacher’s guidance, is one of the key elements that can explain students’ learning (Andrade and Chacón, 2018; Torrecilla, 2018). These elements are emphasized in the experimental intervention and, on the contrary, are not usually part of more traditional teaching.

Despite the positive results obtained by the EG in all the factors of the questionnaire, the analysis revealed that, in both groups (EG and CG), students showed a certain lack of interest towards statistics (item 3 → EG p = 0.007; CG p = 0.047) and did not come to enjoy learning statistics (item 1 → EG p = 0.081; CG p = 0.104), since these items presented lower scores in the post-intervention measure than in the pre-test measure. Also, technology did not make the learning of statistics more interesting (item 16 → EG p = 0.976; CG p = 0.50). Despite this negative perception of learning statistics, the actual improvement of the EG compared to the CG is remarkable, as it is supported by statistically significant differences.

One possible explanation for the slight improvement in the EG students’ affection may be that PBL involves more complex procedures, more effort, and more time to complete tasks than traditional teaching of the same content (Koparan and Güven, 2014). Since this was the students’ first experience with PBL, they may need to be involved in further, longer-term PBL experiences to develop more positive perceptions about learning statistics. Therefore, more long-term studies using innovative methodologies are needed to investigate further the impact of these methodologies on students’ attitudes and affection in learning statistics (Siswono et al., 2018).

Our results also reveal that the SPIDAS instruction had a positive impact on the decrease of the students’ anxiety towards learning statistics (H2). In addition, our results show that the educational intervention followed by the EG had a greater impact than the traditional intervention followed by the control group in reducing students’ anxiety towards learning statistics. These findings are consistent with those found in previous research studies, which claimed a strong relationship between mathematical anxiety, motivation and mathematical achievement (Abín et al., 2020 ; Henschel and Roick, 2017 ; Passolunghi et al., 2016 ).

On the other hand, unlike the results obtained with EG students, the CG students’ anxiety may not have decreased because they learned statistical contents with insufficient context and application to real life. In this line of argument, Lalayants ( 2012 ) claimed in his study that the fear felt by a group of university students towards learning statistics was caused mainly by the lack of connection between their studies and statistics. Basically, they did not understand how to apply the content to real-life situations. The same group of university students declared in the questionnaire that it would have helped them reduce anxiety if they had found any of the following aspects in their statistics classes, i.e., practical real-life problem-solving related to their future profession, teachers who cared about their negative feelings, working sessions with technology, and collaborative work in small groups, as opposed to individual work.

Although students decreased their anxiety towards statistics after their participation in the SPIDAS intervention, students still expressed certain levels of anxiety when doing statistics (item 11 of the questionnaire → EG p  = 0.069; CG p  = 0.661). On this issue, some authors defend that a low level of anxiety toward statistics is not necessarily totally negative. In some cases, a low level of anxiety can motivate students not to give up and continue working to understand the content (Çiftçi, 2015 ).

These results coincide with those of previous studies indicating that implementing a collaborative and student-centred learning methodology (Bateiha et al., 2020) and carrying out contextualized activities involving real-life problems (Chong et al., 2019) contribute to increasing students’ affection. These characteristics were included in the SPIDAS intervention and, therefore, helped to improve the EG students’ value judgements and motivation in the present study.

Regarding the intensive use of a variety of technological tools to learn key statistical concepts and data analysis skills (Hypothesis 4), our study indicates a statistically significant improvement in the EG students’ attitude towards learning statistics with technology, whereas CG students did not show any progress on this variable. Therefore, the designed SPIDAS intervention confirms that the combination of technology use and collaborative work in small groups is a powerful pedagogical approach to improve students’ attitudes towards statistics.

These results are consistent with those found by various authors, such as Kazak et al. ( 2014 ), who present the use of the TinkerPlots software as an enabling technology for understanding statistical concepts. Other authors conclude that new technologies encourage collaboration, motivation and facilitate the performance of student-centred activities, a combination that improves students’ attitudes toward learning statistics (Attard and Holmes, 2020 ; Bray and Tangney, 2017 ; Gonzalez and Trelles, 2019 ; Moreno-Guerrero et al., 2020 ).

It is worth noting the higher increase shown by EG in comparison with CG in Item 6 (EG p  = 0.000; CG p  = 0.031). In our view, this result suggests that the use of CODAP throughout the SPIDAS intervention supported the students’ creation of useful data visualizations and students’ learning of data analytics skills. CODAP software facilitates learning, and this has a noticeable positive impact on student attitudes (Woodard et al., 2020 ).

As a final conclusion, this study suggests that implementing technology-enhanced, collaborative, and data-driven project-based learning can provide the basis for an appropriate teaching approach to improve secondary students’ attitudes toward statistics, to have a positive impact on students’ motivation to learn statistics, and to reduce anxiety when solving problems in this subject. In light of the results of this study, these three educative variables should be considered and included in the design of educational interventions that aim to engage students in data analytics, in which students select a real problem to investigate, collect and explore appropriate data, make inferences and discuss their conclusions using a data-based approach.

The study has some limitations that call for further research. Firstly, this was the students’ first encounter with statistics, with the CODAP software and with PBL, and this learning approach requires high cognitive involvement from students during the learning process (Ge and Chua, 2019). This novelty may cause a cognitive overload that could reduce the impact of the innovative intervention on the students’ attitudes toward statistics. Therefore, designing longer interventions with a longitudinal research approach, capable of improving the students’ attitudes over a longer period of time, would probably soften the impact of cognitive overload.

Secondly, our study has revealed that, despite the innovative intervention, some EG students still feel anxiety towards learning statistics. Nevertheless, some research claims that a certain level of anxiety can encourage students not to give up and, as a result, to successfully complete a task (Çiftçi, 2015). Therefore, it would be interesting to design more qualitative research methods capable of capturing and measuring positive levels of anxiety towards learning statistics.

Thirdly, as previous studies noted the relevance of socioeconomic status in academic achievement (Berkowitz et al., 2017 ), for future research, the socioeconomic level of the students will be considered as an independent variable.

Fourthly, the questionnaire used in the study has been useful for evaluating students’ attitudes toward statistics using technology. However, the questionnaire needs further validation to extend the results to other contexts. To this end, as a future research action, we plan to expand the sample, analyse its external validity, and compare the results of the confirmatory factor analysis obtained in this study against other samples (such as students from other courses of Secondary Education, Upper Secondary Education and even university students).

The overall results found in this study are promising for improving students’ data analysis competences with technology and they can be seen as a contribution to the United Nations Education 2030 Agenda which emphasises the need to equip all students with technological and mathematical knowledge. By following this agenda, it is expected that a greater number of students will reach the minimum levels of knowledge in mathematics.

Data availability

The data supporting this study’s findings are available from the corresponding author ([email protected]), upon reasonable request. The data are not publicly available because they contain information that could compromise the privacy of research participants.

Abín A, Núñez JC, Rodríguez C et al (2020) Predicting mathematics achievement in secondary education: the role of cognitive, motivational, and emotional variables. Front Psychol 11. https://doi.org/10.3389/fpsyg.2020.00876

Albelbisi NA, Yusop FD (2018) Secondary school students’ use of and attitudes toward online mathematics homework. Turk Online J Educ Technol 17(1):144–153. https://files.eric.ed.gov/fulltext/EJ1165745.pdf

Andrade E, Chacón E (2018) Implicaciones teóricas y procedimentales de la clase invertida [Theoretical and procedural implications of flipped classroom]. Pulso 41:251–267

Attard C, Holmes K (2020) It gives you that sense of hope: an exploration of technology use to mediate student engagement with mathematics. Heliyon 6(1):e02945. https://doi.org/10.1016/j.heliyon.2019.e02945

Bargagliotti A, Franklin C, Arnold P et al (2020) Pre-K12 Guidelines for Assessment and Instruction in Statistics Education (GAISE) report II. American Statistical Association and National Council of Teachers of Mathematics

Batanero C, Díaz C (2011) Estadística con proyectos [Statistics with projects]. Universidad de Granada

Bateiha S, Marchionda H, Autin M (2020) Teaching style and attitudes: a comparison of two collegiate introductory statistics classes. J Stat Educ 28(2):154–164. https://doi.org/10.1080/10691898.2020.1765710

Berkowitz R, Moore H, Astor RA et al. (2017) A research synthesis of the associations between socioeconomic background, inequality, school climate, and academic achievement. Rev Educ Res 87(2):425–469. https://doi.org/10.3102/0034654316669821

Bray A, Tangney B (2017) Technology usage in mathematics education research—a systematic review of recent trends. Comput Educ 114:255–273. https://doi.org/10.1016/j.compedu.2017.07.004

Casey S, Hudson R, Harrison T et al. (2020) Preservice teachers’ design of technology-enhanced statistical tasks. Contemp Issues Technol Teach Educ 20(2):269–292

Chang Y, Brickman P (2018) When group work doesn’t work: insights from students. CBE—Life Sci Educ 17(3):ar52. https://doi.org/10.1187/cbe.17-09-0199

Chew PK, Dillon DB (2014) Statistics anxiety update: refining the construct and recommendations for a new research agenda. Perspect Psychol Sci 9(2):196–208. https://doi.org/10.1177/1745691613518077

Chong FMS, Shahrill M, Li HC (2019) The integration of a problem-solving framework for Brunei high school mathematics curriculum in increasing student’s affective competency. J Math Educ 10(2):215–228

Çiftçi SK (2015) Effects of secondary school students’ perceptions of mathematics education quality on mathematics anxiety and achievement. Educ Sci: Theory Pract 15(6):1487–1501. https://doi.org/10.12738/estp.2015.6.2829

Cujba A, Pifarré M (2023) Relaciones entre el aprendizaje de la estadística y las actitudes del alumnado en el marco de un proyecto de análisis de datos con tecnología. Educ Mat 35(2):196–225. https://doi.org/10.24844/em3502.08

Cujba A, Pifarré M (2024) Validación exploratoria de un cuestionario de actitudes hacia la estadística con tecnología. Campus Virtuales 13(1):47–58. https://doi.org/10.54988/cv.2024.1.1266

Di Martino P, Zan R (2015) The construct of attitude in mathematics education. In: Pepin B, Roesken-Winter B (eds) From beliefs to dynamic affect systems in mathematics education. Springer, Switzerland, pp. 269–277

Domu I, Pinontoan KF, Mangelep NO (2023) Problem-based learning in the online flipped classroom: its impact on statistical literacy skills. J Educ e-Learn Res 10(2):336–343. https://doi.org/10.20448/jeelr.v10i2.4635

Dowker A, Cheriton O, Horton R et al. (2019) Relationships between attitudes and performance in young children’s mathematics. Educ Stud Math 100(3):211–230. https://doi.org/10.1007/s10649-019-9880-5

Emmioğlu E, Capa-Aydin Y (2012) Attitudes and achievement in statistics: a meta-analysis study. Stat Educ Res J 11(2). https://iase-web.org/documents/SERJ/SERJ11(2)_Emmioglu.pdf

Engel J (2017) Statistical literacy for active citizenship: a call for data science education. Stat Educ Res J 16(1):44–49. https://doi.org/10.52041/serj.v16i1.213

Fredriksen H (2021) Investigating the affordances of a flipped mathematics classroom from an activity theoretical perspective. Teach Math Appl: Int J IMA 40(2):83–98. https://doi.org/10.1093/teamat/hraa011

Frischemeier D, Biehler R, Podworny S et al (2021) A first introduction to data science education in secondary schools: teaching and learning about data exploration with CODAP using survey data. Teach Stat 43:S182–S189. Wiley Editorial https://doi.org/10.1111/test.12283

Frischemeier D, Kazak S, Leavy A et al (2022) International perspectives on early statistical thinking: comparison of primary school curricula in different countries. In Peters SA, Zapata-Cardona L, Bonafini F, Fan A (eds) Bridging the gap: empowering & educating today’s learners in statistics. Proceedings of the 11th International Conference on Teaching Statistics (ICOTS11 2022), Argentina

Fujita T, Kazak S, Turmo MP et al (2018) Strategic partnership for innovative in data analytics in schools. https://blogs.exeter.ac.uk/spidasatexeter/files/2019/01/State_of_Art_Review_partA_final_draft_V3.pdf

Gal I, Ginsburg L (1994) The role of beliefs and attitudes in learning statistics: towards an assessment framework. J Stat Educ 2(2) https://doi.org/10.1080/10691898.1994.11910471

Garfield J, Ben‐Zvi D (2007) How students learn statistics revisited: a current review of research on teaching and learning statistics. Int Stat Rev 75(3):372–396. https://doi.org/10.1111/j.1751-5823.2007.00029.x

Ge X, Chua BL (2019) The role of self‐directed learning in PBL: implications for learners and scaffolding design. In: Moallem M, Hung W, Dabbagh N (eds) The Wiley handbook of problem‐based learning. pp 367–388

Gómez LF (2016) Intención y competencia pedagógica: el uso del aprendizaje colaborativo en la asignatura de matemáticas en secundaria [Intention and pedagogical competence: use of collaborative learning in the subject of mathematics in secondary school]. Prop Represent 4(2):133–179. https://doi.org/10.20511/pyr2016.v4n2.121

Gonzales N, Trelles C (2019) Mathematical modeling and Tinker Plots in solving problems. In: 2019 XIV Latin American Conference on Learning Technologies (LACLO). San Jose Del Cabo, Mexico, pp. 367–374. https://doi.org/10.1109/LACLO49268.2019.00068

Haatainen OM, Aksela M (2021) Project-based learning in integrated science education: active teachers’ perceptions and practices. LUMAT: Int J Math Sci Technol Educ 9(1):149–173. https://doi.org/10.31129/LUMAT.9.1.1392

Haddar AG, Hendriyanto D, Munandar H et al. (2023) Analysis of the effectiveness of project steam-based learning model to improve students’ critical thinking skills. Community Dev J 4(5):10519–10525

Henschel S, Roick T (2017) Relationships of mathematics performance, control and value beliefs with cognitive and affective math anxiety. Learn Individ Differ 55:97–107. https://doi.org/10.1016/j.lindif.2017.03.009

Hwa SP (2018) Pedagogical change in mathematics learning: harnessing the power of digital game-based learning. J Educ Technol Soc 21(4):259–276

Kazak S, Fujita T, Pifarré MT (2021) Students’ informal statistical inferences through data modeling with a large multivariate dataset. Math Think Learn 25(1):23–43. https://doi.org/10.1080/10986065.2021.1922857

Kazak S, Fujita T, Wegerif R (2014) Year six students’ reasoning about random “bunny hops” through the use of TinkerPlots and peer-to-peer dialogic interactions. In Makar K, de Sousa B, Gould R (eds) Proceedings of the 9th International Conference on Teaching Statistics (ICOTS 9). Flagstaff, AZ, USA. Voorburg, The Netherlands: International Statistical Institute

Knight S, Mercer N (2015) The role of exploratory talk in classroom search engine tasks. Technol Pedagog Educ 24(3):303–319. https://doi.org/10.1080/1475939X.2014.931884

Koparan T, Güven B (2014) The effect of project based learning on the statistical literacy levels of student 8th grade. Eur J Educ Res 3(3):145–157. https://doi.org/10.12973/eu-jer.3.3.145

Lalayants M (2012) Overcoming graduate students’ negative perceptions of statistics. J Teach Soc Work 32(4):356–375. https://doi.org/10.1080/08841233.2012.705259

Lyons KM, Lobczowski NG, Greene JA et al. (2021) Using a design-based research approach to develop and study a web-based tool to support collaborative learning. Comput Educ 161:104064. https://doi.org/10.1016/j.compedu.2020.104064

Major L, Haßler B, Hennessy S (2017) Tablet use in schools: impact, affordances and considerations. In: Handbook on digital learning for K-12 schools. Springer, Cham, pp. 115–128

Major L, Warwick P, Rasmussen I et al. (2018) Classroom dialogue and digital technologies: a scoping review. Educ Inf Technol 23(5):1995–2028. https://doi.org/10.1007/s10639-018-9701-y

Makar K, Rubin A (2018) Learning about statistical inference. In: Ben-Zvi D, Makar K, Garfield J (eds) International handbook of research in statistics education. Springer International handbooks of education. Springer International Publishing, pp 261–294

Markulin K, Bosch M, Florensa I (2021) Project-based learning in Statistics: a critical analysis. Caminhos Educ Mat Rev 11(1):200–220

McNamara A (2018) Key attributes of a modern statistical computing tool. Am Stat 1–30 https://doi.org/10.1080/00031305.2018.1482784

Mercer N (2019) Language and the joint creation of knowledge: the selected works of Neil Mercer. Routledge

Mercer N, Hennessy S, Warwick P (2019) Dialogue, thinking together and digital technology in the classroom: some educational implications of a continuing line of inquiry. Int J Educ Res 97:187–199. https://doi.org/10.1016/j.ijer.2017.08.007

Mojica GF, Barker H, Azmy CN (2019) Instrumented learning in a CODAP-enabled learning environment. In: Contreras JM, Gea MM, López-Martín MM, Molina-Portillo E (eds) Actas del Tercer Congreso Internacional Virtual de Educación Estadística. CIVEEST

Moreno-Guerrero AJ, Rondon GM, Martinez HN et al. (2020) Collaborative learning based on Harry Potter for learning geometric figures in the subject of mathematics. Mathematics 8(3):369. https://doi.org/10.3390/math8030369

Muñoz JM, Arias MA, Mato MD (2018) Elementos predictores del rendimiento matemático en estudiantes de Educación Secundaria Obligatoria [Elements of mathematical predictive performance in compulsory secondary education students]. Profesorado 22:391–408. https://doi.org/10.30827/profesorado.v22i3.8008

Nolan MM, Beran T, Hecker KG (2012) Surveys assessing students' attitudes toward statistics: a systematic review of validity and reliability. Stat Educ Res J 11(2):103–123

Noll J, Clement K, Dolor J et al. (2018) Students’ use of narrative when constructing statistical models in TinkerPlots. ZDM 50(7):1267–1280. https://doi.org/10.1007/s11858-018-0981-x

Ozdamli F, Karabey D, Nizamoglu B (2013) The effect of technology supported collaborative learning settings on behaviour of students towards mathematics learning. Procedia-Soc Behav Sci 83:1063–1067. https://doi.org/10.1016/j.sbspro.2013.06.198

Özdemir AS, Yildiz F, Yildiz SG (2015) The effect of project based learning in “ratio, proportion and percentage” unit on mathematics success and attitude. Eur J Sci Math Educ 3(1):1–13

Özmen ZM, Baki A (2021) Statistics instructors’ perceptions of statistics literacy in different undergraduate programs. Int J Res Educ Sci 7(3):852–871. https://doi.org/10.46328/ijres.1817

Pai HH, Sears DA, Maeda Y (2015) Effects of small-group learning on transfer: a meta-analysis. Educ Psychol Rev 27(1):79–102. https://doi.org/10.1007/s10648-014-9260-8

Passolunghi MC, Caviola S, De Agostini R et al. (2016) Mathematics anxiety, working memory, and mathematics performance in secondary-school children. Front Psychol 7:42. https://doi.org/10.3389/fpsyg.2016.00042

Pifarré M (2019) Using interactive technologies to promote a dialogic space for creating collaboratively: a study in secondary education. Think Skills Creat 32:1–16. https://doi.org/10.1016/j.tsc.2019.01.004

Ramirez C, Schau C, Emmioglu E (2012) The importance of attitudes in statistics education. Stat Educ Res J 11(2). https://iase-web.org/documents/SERJ/SERJ11(2)_Ramirez.pdf

Rao VNV, Legacy C, Zieffler A et al. (2023) Designing a sequence of activities to build reasoning about data and visualization. Teach Stat 45(S1):S80–S92. https://doi.org/10.1111/test.12341

Ridgway J, Nicholson J, Campos P et al (2017) Tools for visualising data: a review. In Molnar A (ed) Teaching statistics in a data rich world. Proceedings of the satellite conference of the International Association for Statistical Education (IASE)

Santos LM (2016) La resolución de problemas matemáticos y el uso coordinado de tecnologías digitales [Mathematical problems resolution and the coordinated use of digital technologies]. In: Angel R (ed) Cuadernos de Investigación y Formación en Educación Matemática. Universidad de Costa Rica, Costa Rica, pp. 333–346

Savelsbergh ER, Prins GT, Rietbergen C et al. (2016) Effects of innovative science and mathematics teaching on student attitudes and achievement: a meta-analytic study. Educ Res Rev 19:158–172. https://doi.org/10.1016/j.edurev.2016.07.003

Silva ODLD, Sousa Á (2020) Effects of life satisfaction on students’ attitudes towards statistics and technology and their interrelationships. In: 13th International Conference of Education, Research and Innovation (ICERI2020). IATED Academy, pp. 4994–5002

Siswono TYE, Hartono S, Kohar AW (2018) Effectiveness of project based learning in statistics for lower secondary schools. Eurasia J Educ Res 18(75):197–212. https://doi.org/10.14689/ejer.2018.75.11

Sutherland S, Ridgway J (2017) Interactive visualisations and statistical literacy. Stat Educ Res J 16(1):26–30. https://doi.org/10.52041/serj.v16i1.210

Szczygieł M, Pieronkiewicz B (2021) Exploring the nature of math anxiety in young children: intensity, prevalence, reasons. Math Think Learn 24(3):248–266. https://doi.org/10.1080/10986065.2021.1882363

Torrecilla MS (2018) Flipped Classroom: Un modelo pedagógico eficaz en el aprendizaje de Science [Flipped classroom: an effective pedagogical model in Science learning]. Rev IberoamEduc 76(1):9–22. https://doi.org/10.35362/rie7612969

Tuohilampi L (2016) Contextualizing mathematics related affect: significance of students’ individual and social level affect in Finland and Chile. REDIMAT 5(1):7–27. https://doi.org/10.4471/redimat.2016.1823

Wild CJ, Pfannkuch M (1999) Statistical thinking in empirical inquiry. Int Stat Rev 67(3):223–248

Wild CJ, Utts JM, Horton NJ (2011) What is statistics? In: Ben-Zvi D, Makar K, Garfield J (eds) International handbook of research in statistics education. Springer international handbooks of education. Springer International Publishing, pp 5–36

Williams AS (2015) Statistics anxiety and worry: the roles of worry beliefs, negative problem orientation, and cognitive avoidance. Stat Educ Res J 14(2):53–75. http://iase-web.org/documents/SERJ/SERJ14(2)_Williams.pdf

Woodard V, Lee H, Woodard R (2020) Writing assignments to assess statistical thinking. J Stat Educ 28(1):32–44. https://doi.org/10.1080/10691898.2019.1696257

Zotou M, Tambouris E, Tarabanis K (2020) Data-driven problem based learning: enhancing problem based learning with learning analytics. Educ Technol Res Dev 68(6):3393–3424. https://doi.org/10.1007/s11423-020-09828-8

Acknowledgements

This paper has been funded by the Strategic Partnership for the Innovative Application of Data Analytics in Schools (SPIDAS) project, European Union’s Erasmus+, under Grant 2017-1-UK01-KA201-036520. Furthermore, the paper has been partially funded by the Spanish Ministry of Science and Innovation under Grant PDC2022-133203-I00. All views expressed are those of the authors, not the European Commission or the Spanish Ministry. Finally, the authors would like to thank the teachers and the pupils of the schools Claver Raïmat Jesuïtes-Lleida and Maristes Montserrat-Lleida for their participation in the study reported in this paper.

Author information

Authors and affiliations

Department of Psychology, Faculty of Education, Universitat de Lleida, Lleida, Spain

Andreea Cujba & Manoli Pifarré

Contributions

Manoli Pifarré, as the project’s principal investigator, led the conceptual background and the methodology of the paper. Both authors were equally involved in data collection, analysis, and manuscript writing.

Corresponding author

Correspondence to Manoli Pifarré .

Ethics declarations

Competing interests

The authors declare no competing interests.

Ethical approval

Approval was obtained from the University of Lleida ethics committee. The vice-rector of research and transfer signed the International Strategic Partnership for Innovative in Data Analytics in Schools (SPIDAS) ERASMUS+ project implementation in the schools that were partners of the project. The procedures used in this study adhere to the tenets of the Declaration of Helsinki.

Informed consent

The schools and teachers participating in this study were also partners in the SPIDAS ERASMUS+ project. The headteachers and the participating teachers were responsible for collecting signed informed consent forms from the parents of all students involved in the study. The signed informed consent forms were collected prior to the educational intervention. These forms explicitly requested permission from the families to use the data for research purposes.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/ .

About this article

Cite this article

Cujba, A., Pifarré, M. Enhancing students’ attitudes towards statistics through innovative technology-enhanced, collaborative, and data-driven project-based learning. Humanit Soc Sci Commun 11 , 1094 (2024). https://doi.org/10.1057/s41599-024-03469-5

Received : 13 September 2023

Accepted : 15 July 2024

Published : 27 August 2024

DOI : https://doi.org/10.1057/s41599-024-03469-5


