The document provides guidance on conducting a survey for research purposes. It discusses selecting a problem and establishing its significance, conducting a literature review, choosing participants and sampling methods, selecting appropriate data collection instruments such as questionnaires, interviews and observations, and considerations for validity and reliability. The steps include defining the research problem, reviewing related literature, identifying participants, developing and testing measuring instruments, and executing the data collection procedure.
How to Conduct Survey Research (Semester 1, Session 2015/2016)
2. Selecting and defining a problem
What do you intend to do?
Why do you intend to do it?
The problem statement should be written clearly.
3. Significance of the Study
The importance of the topic being investigated:
Can the findings be applied by practitioners?
Will the findings be considered necessary to advance the pool of educational knowledge in general?
4. Literature Review
The relationship between the problem statement and theory (psychology, sociology, etc.)
One theory? Two theories? Three theories?
Related? Relevant? Focused?
7. Selecting participants
Identify the respondents who have the information you need.
Who has access to the information? E.g., people who have experienced an event, or people with the characteristics of those who have experienced an event.
For very young children, it may be necessary to ask parents or other responsible adults for the information.
The best source of information depends on the type of information needed, e.g., parents, school records, teachers, etc.
Examples: the reading behavior of five-year-old children; the leadership style of the headmaster.
8. Selecting participants
Population and Sampling
Define the population under study.
Once the general nature of the respondents has been identified, the researcher has to be more specific about the sources of information: the conceptual definition of the population has to be translated into operational terms.
Determine the sample size (see the sketch below).
Choose the samples.
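One common way to pin down the sample size is Cochran's formula with a finite-population correction. The Python sketch below is a minimal illustration, assuming a 95% confidence level, a 5% margin of error, and a hypothetical population of 1,200 respondents.

    import math

    def cochran_sample_size(population_size, z=1.96, p=0.5, e=0.05):
        # Cochran's formula: z = z-score for the confidence level (1.96 ~ 95%),
        # p = estimated proportion (0.5 is most conservative), e = margin of error.
        n0 = (z ** 2) * p * (1 - p) / (e ** 2)        # infinite-population estimate
        n = n0 / (1 + (n0 - 1) / population_size)     # finite-population correction
        return math.ceil(n)

    # Hypothetical population of 1,200 teachers
    print(cochran_sample_size(1200))   # about 292 respondents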
9. Selecting participants
Population and Sampling
Choose relevant sampling procedures (a simple random sampling sketch follows below).
If a sample is well selected, the results of a study of the sample should be generalizable to the population (Gay, Mills, & Airasian, 2009).
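A minimal sketch of one sampling procedure, simple random sampling without replacement, assuming the sampling frame is a hypothetical list of student IDs:

    import random

    def simple_random_sample(frame, sample_size, seed=2016):
        # 'frame' is the operational list of population members (the sampling frame).
        # A fixed seed keeps the draw reproducible for documentation purposes.
        rng = random.Random(seed)
        return rng.sample(frame, sample_size)

    frame = ["S%04d" % i for i in range(1, 1201)]   # hypothetical frame of 1,200 students
    sample = simple_random_sample(frame, 292)
    print(len(sample), sample[:5])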
12. Selecting measuring instruments
Gay, Mills & Airasian (2009):
Questionnaire
A written collection of survey questions to be answered by
a selected group of research participants
Interview
An oral, in-person question-and-answer session between a
researcher and an individual respondent.
Observation (Wiseman, 1999)
A process by which desired information is obtained by observing, or video recording, the occurrence or nonoccurrence of defined behavior.
E.g., a researcher wants to evaluate student behavior in the classroom or in a specific activity.
15. Selecting measuring instruments
Likert scale
Numeric format: read the statement and circle whether you strongly disagree (1), disagree (2), uncertain (3), agree (4), or strongly agree (5).
Letter format: read the statement and circle whether you strongly disagree (SD), disagree (D), uncertain (U), agree (A), or strongly agree (SA).
Item | Strongly Disagree | Disagree | Uncertain | Agree | Strongly Agree
My science teacher is friendly. | 1 | 2 | 3 | 4 | 5
My science teacher is friendly. | SD | D | U | A | SA
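For analysis, letter-coded Likert responses are usually converted to the same 1-5 numeric scale. The Python sketch below is one minimal illustration; the item, codes, and responses are hypothetical.

    # Map the letter codes to the 1-5 numeric Likert scale
    LIKERT_SCORES = {"SD": 1, "D": 2, "U": 3, "A": 4, "SA": 5}

    def score_responses(responses):
        # Unknown or blank codes are treated as missing (None)
        return [LIKERT_SCORES.get(r) for r in responses]

    responses = ["SA", "A", "U", "SD", "A"]          # hypothetical answers to one item
    scores = score_responses(responses)
    valid = [s for s in scores if s is not None]
    print(scores)                                    # [5, 4, 3, 1, 4]
    print(sum(valid) / len(valid))                   # item mean = 3.4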
16. Selecting measuring instruments
Free Response Item
Write about your science teacher.
__________________________________________
__________________________________________
__________________________________________
__________________________________________
__________________________________________
__________________________________________
17. Selecting measuring instruments
GUIDELINES
Collect demographic information about the samples
Include only items related to the objective of the study
Questionnaires should be brief
Items should be uni-dimensional
Avoid terms that may give different meanings to different people
Avoid questions that assume a fact not necessarily true
Write directions for respondents
Pilot-test the questionnaire
23. Measuring Instrument
Construct
Cannot be observed directly
A concept invented to explain behavior
To be measurable, constructs must be operationally defined
E.g., attitude towards the teaching profession
24. Selecting measuring instruments
GUIDELINES: Questionnaires should be brief
Short questions are easier to understand
Avoid trick words
Respondents are much more likely to complete a short questionnaire
26. Selecting measuring instruments
GUIDELINES: Avoid terms that may give different meanings to different people
Quantify responses whenever possible.
The words "sometimes", "often", and "usually" have different meanings for different people.
Use words such as "daily", "once a week", etc.
27. Selecting measuring instruments
GUIDELINES: Avoid questions that assume a fact not necessarily true
How much money do you spend on food in the canteen every day? (X)
Better: Do you buy food from the canteen? If so, how much money do you spend on food in the canteen every day?
How many times do you bring your students to the computer laboratory per week?
28. Selecting measuring instruments
GUIDELINES: Write directions for respondents
Directions tell respondents what to do, how to respond, and what the questionnaire is about.
Circle the choice that you most agree with.
Rank your choices from 1 to 4.
Tick your answer.
30. Selecting measuring instruments
GUIDELINES: Pilot-test the questionnaire
Try it out with a small sample similar to the intended group of respondents.
Check the validity of the instrument.
Check the reliability of the instrument.
31. Measuring Instrument
Achievement test
Paper-and-pencil test
Performance test
Aptitude test/Intelligence test
To measure skills and knowledge in specific areas
Attitude scale
Attitudes of individuals/groups are of great interest to educational researchers
32. Measuring Instrument
Standardized instrument
Selecting a standardized instrument takes less time than developing one's own instrument.
Translation may be needed.
Self-developed instrument
A standardized instrument may not be suitable for the specific objectives of a research study.
33. Validity
Measures what it is supposed to measure
Abstract variables vs. the physical sciences:
anxiety, motivation, attitudes vs. length, volume, weight
Categories of validity
Concurrent validity
Construct validity
Content validity
Predictive validity
34. Validity
Concurrent validity
The degree to which performance on one test relates
to performance on a previously validated test.
0.00-0.29: low correlation
0.30-0.69: medium correlation
0.70-1.00: high correlation
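Concurrent validity is typically reported as a Pearson correlation between scores on the new instrument and scores on the previously validated test. A minimal sketch with made-up paired scores:

    from statistics import mean, pstdev

    def pearson_r(x, y):
        # Pearson correlation between two paired lists of scores
        mx, my = mean(x), mean(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
        return cov / (pstdev(x) * pstdev(y))

    new_test = [55, 60, 72, 80, 90, 65]   # hypothetical scores on the new instrument
    old_test = [50, 62, 70, 78, 88, 66]   # scores on the previously validated test
    print(round(pearson_r(new_test, old_test), 2))   # about 0.98: high by the 0.70-1.00 rule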
35. Validity
Construct validity
Refers to the degree to which an assessment instrument measures a trait that is not directly observable.
Does not yield a correlation coefficient; rests on judgments.
36. Validity
Content validity
The extent to which experts believe that the
instrument addresses the research objectives.
Based upon judgments
E.g., an achievement test.
37. Validity
Predictive validity
The degree to which estimated performance becomes
reality
E.g., Scholastic Aptitude Test scores predicting later CGPA.
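One way to picture predictive validity is to fit a simple regression of the criterion (CGPA) on the predictor (aptitude-test score) and see how well the estimates match later performance; the validity coefficient itself is usually the correlation between predictor and criterion. The sketch below uses made-up scores and ordinary least squares.

    def fit_line(x, y):
        # Ordinary least-squares fit of y = a + b*x for paired scores
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
        return my - b * mx, b

    sat = [450, 520, 580, 610, 700]     # hypothetical aptitude-test scores
    cgpa = [2.4, 2.8, 3.1, 3.0, 3.6]    # CGPAs later earned by the same students
    a, b = fit_line(sat, cgpa)
    print(round(a + b * 650, 2))        # predicted CGPA (about 3.3) for a score of 650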
38. Reliability
Measures whatever it is measuring consistently.
A measuring instrument can be reliable without being valid.
To be valid, however, a measuring instrument must be reliable.
40. Reliability Coefficients
Test-retest coefficient (coefficient of stability)
Consistency of subjects' scores over time
Administer a test to the same group of individuals
on two occasions
Correlate the paired scores, r
41. Reliability Coefficients
Alternate-forms coefficient (coefficient of equivalence)
Consistency of subjects' scores on two equivalent tests
Administer two tests (a test and its equivalent) to
the same group of individuals
Correlate the paired scores, r
42. Reliability Coefficients
Internal-consistency coefficients
To determine whether all the items in a test are
measuring the same thing
Administer a test to a group of individuals
SPSS: Analyze > Scale > Reliability Analysis > Cronbach's Alpha (α)
Minimum reliability?
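The same coefficient SPSS reports can be computed directly. A minimal sketch of Cronbach's alpha for a hypothetical three-item scale answered by five respondents (a minimum of about 0.70 is a commonly cited benchmark):

    from statistics import variance

    def cronbach_alpha(items):
        # 'items' is a list of items; each item is the list of all respondents'
        # scores on that item, in the same respondent order.
        k = len(items)
        item_vars = sum(variance(item) for item in items)
        totals = [sum(scores) for scores in zip(*items)]   # each respondent's total score
        return (k / (k - 1)) * (1 - item_vars / variance(totals))

    items = [
        [4, 5, 3, 4, 2],    # item 1 scores for respondents 1-5 (made up)
        [4, 4, 3, 5, 2],    # item 2
        [5, 4, 2, 4, 3],    # item 3
    ]
    print(round(cronbach_alpha(items), 2))   # about 0.86 for this made-up data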
43. Executing the research procedure (data collection)
Directly administered questionnaires
The questionnaire is administered to a group of people at a certain place.
Where and when?
Mailed questionnaires
Cover letter (mail/e-mail):
Brief and neat
Explains the purpose of the study
Includes your contact number, mailing address, and e-mail address
Include a self-addressed stamped envelope (for mailed questionnaires)