Survey research and design in psychology/Assessment/Lab report/Feedback/2014
General feedback about the lab report (2014)
Marking distribution
Descriptive statistics:
- Mean 63.92
- Median 66.40
- SD 17.64
- Min 1.42
- Max 91.50
- N 88
Grade breakdown:
- HD 9%
- DI 20%
- CR 26%
- P 31%
- F 14%
Title
- The title should clearly and unambiguously communicate the main content of the report (weaker titles tended to be vague and lacked reference to the main variables analysed and their relationship(s)).
- Longer titles generally provide more appropriate detail than shorter titles (try to mention the key variables or questions).
- Some reports didn't provide an APA style title page.
Abstract
- Typically too much focus on the Intro/Method and not enough on the Results and Discussion.
- What were the labels of the identified factors?
- Strength and direction of results were often not indicated.
- Often there was no mention of implications or recommendations.
- Statistical results (i.e., with symbols and numbers) should not generally be reported in the abstract unless they are particularly pertinent (e.g., a notable effect size).
- References should not be reported in the abstract unless they are particularly pertinent (e.g., to draw attention to a key theory which is being tested).
Introduction
- The single major criterion is whether the introduction provides a review of the literature that leads directly to a clearly expressed, logically derived research question and hypotheses.
- For example, weaker introductions may well have reviewed some literature but this wasn't necessarily related directly to justifying each of the hypotheses.
- When stating the research question, try to integrate it into a sentence, rather than posing it as a stand-alone question.
- The MLR hypotheses should include predicted direction of the relationship (where applicable).
Method
Participants
- Often only a basic profile of the sample was provided (e.g., N, the n and percentage of males and females, and the average age with SD and range; what about the median?). Such descriptions were graded as P-level. A brief example of pulling together such a summary follows this list.
- Better sections provided more thoughtful description of the sample (e.g., more description of the cultural context, so that an outside/naive reader can better understand the sample, and/or comparison with known statistics about the university population).
- Further description of the sample could have been provided by using other demographic information (e.g., enrolment status, living status, completion).
- Some reports included procedural information in the participants section (e.g., how the participants were recruited). The participants section is for describing the sample, not for providing details of data collection.
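As a rough illustration of assembling this kind of sample profile, here is a minimal Python sketch; the data frame, column names, and values are all hypothetical.

```python
import pandas as pd

# Hypothetical respondent data with 'gender' and 'age' columns
df = pd.DataFrame({
    "gender": ["F", "M", "F", "F", "M", "F"],
    "age": [19, 23, 21, 35, 20, 22],
})

n_total = len(df)
gender_n = df["gender"].value_counts()            # n per gender
gender_pct = (gender_n / n_total * 100).round(2)  # % per gender

age = df["age"]
print(f"N = {n_total}")
print(pd.DataFrame({"n": gender_n, "%": gender_pct}))
print(f"Age: M = {age.mean():.2f}, SD = {age.std():.2f}, "
      f"Mdn = {age.median():.2f}, range = {age.min()}-{age.max()}")
```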
Measures
- Generally, the instrument's purpose, development, and structure were well explained.
- Weaker sections tended to lack sufficient description of the proposed factors (e.g., a table summarising the proposed factor names, definitions, with example items).
- Better sections tended to place more emphasis on the measurement of variables used in the Results than on other aspects of the survey.
Procedure
- What kind of sampling technique was used? (Hint: It was not random - this was a common mistake - it was convenience sampling, with systematic selection.)
- Make sure to provide a citation and reference to the administration guidelines (otherwise, how can someone replicate the study?).
- Better sections tended to provide more detail about how the administration process proceeded (e.g., response rate, reasons for refusal, anomalies).
Results
General
- There is no need to mention what software was used for well-known and commonly available data analysis techniques, such as those used in this study.
- Avoid referring to SPSS variable names - these are arbitrary. Refer instead to the construct, capitalised (e.g., Stress or University Student Satisfaction).
- Round values to 2 decimal places, except when reporting p values where 3 decimals are recommended.
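As a trivially small illustration of this rounding convention (plain Python string formatting; the values are made up):

```python
r = 0.34567   # a hypothetical correlation
p = 0.01234   # its hypothetical p value
print(f"r = {r:.2f}, p = {p:.3f}")  # -> r = 0.35, p = 0.012
```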
Data screening
- Weaker reports tended to provide relatively little detail about data screening.
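For reports that were light on this, here is a minimal sketch of the kinds of checks that can be reported, assuming a hypothetical pandas data frame of item responses on a 1 to 5 scale:

```python
import pandas as pd

# Hypothetical item responses on a 1-5 scale (the 7 is deliberately out of range)
df = pd.DataFrame({
    "item1": [1, 3, 5, None, 4],
    "item2": [2, 7, 4, 3, 3],
})

print(df.isna().sum())              # missing responses per item
print(((df < 1) | (df > 5)).sum())  # out-of-range responses per item
print(df.describe().round(2))       # univariate descriptives before factor analysis
```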
Factor analysis
- Some reports seemed to remove items before deciding on the number of factors - decide on the number of factors first, then consider which items to remove.
- Some reports did not provide a table of factor loadings sorted in descending order, with communalities
- Most reports provided Cronbach's alpha for each factor, but many did not comment on the acceptability of these values.
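As a rough sketch of producing this kind of output - not the unit's prescribed method, and using sklearn's FactorAnalysis (unrotated by default) rather than SPSS - the item data frame and number of factors below are assumptions, and communalities are taken as the sum of squared loadings:

```python
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def loadings_table(items: pd.DataFrame, n_factors: int) -> pd.DataFrame:
    """Factor loadings with communalities, rows sorted by each item's strongest loading."""
    fa = FactorAnalysis(n_components=n_factors).fit(items)
    loadings = pd.DataFrame(
        fa.components_.T,  # rows = items, columns = factors
        index=items.columns,
        columns=[f"Factor {i + 1}" for i in range(n_factors)],
    )
    loadings["Communality"] = (loadings ** 2).sum(axis=1)
    order = (loadings.drop(columns="Communality")
             .abs().max(axis=1).sort_values(ascending=False).index)
    return loadings.loc[order].round(2)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = items.shape[1]
    return k / (k - 1) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))
```

By convention, alpha values of roughly .7 or above are generally regarded as acceptable - the kind of comment on acceptability that many reports omitted.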
Multiple linear regression
- Some reports did not provide a table of correlations and regression coefficients, along with R² (a sketch follows this list).
- Report on the individual predictors in-text, as well as the overall model.
- It is important to show an understanding of the direction, size, and significance of the MLR relationships for each of the predictors.
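One way to obtain everything needed for such reporting - overall model fit plus per-predictor direction, size, and significance - is a statsmodels sketch like the following; the variable names and data are hypothetical:

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: two predictors and an outcome, one row per respondent
data = pd.DataFrame({
    "satisfaction": [3.2, 4.1, 2.8, 3.9, 4.5, 3.0, 3.7, 4.2],
    "stress":       [4.0, 2.5, 4.5, 3.0, 2.0, 4.2, 3.1, 2.4],
    "engagement":   [3.5, 4.2, 2.9, 4.0, 4.6, 3.1, 3.8, 4.3],
})

X = sm.add_constant(data[["stress", "engagement"]])
model = sm.OLS(data["satisfaction"], X).fit()

# Overall model: R^2 and the F test
print(f"R2 = {model.rsquared:.2f}, F = {model.fvalue:.2f}, p = {model.f_pvalue:.3f}")

# Individual predictors: direction (sign of b), size, and significance
print(pd.DataFrame({"b": model.params, "p": model.pvalues}).round({"b": 2, "p": 3}))

# Zero-order correlations to accompany the regression table
print(data.corr().round(2))
```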
Discussion
- Stronger discussions showed an understanding of the direction and strength of relations between constructs.
- Many reports only interpreted results very briefly, yet limitations were explored in great detail. While limitations are an important part of this section, the primary purpose of the discussion is to demonstrate an understanding of the results and what they mean in relation to previous research.
- A common problem with recommendations and conclusions was their lack of specificity (vagueness).
References
Appendices
- A reader should not have to consult an Appendix to understand the report. Appendices are an optional adjunct (e.g., they aren't used much in journal articles). Any content specifically related to the marking criteria should be presented in the main body.
Common problems were:
- Formatting of title page
- Formatting of headings: See: http://blog.apastyle.org/apastyle/2009/07/five-essential-tips-for-apa-style-headings.html
- Use of running head
- Do not include the heading for "Introduction" but do repeat the title before the introduction
- Where more than two references are cited consecutively, ensure they are in alphabetical order.
- Numbers under 10 which are used in sentences should be written in words.
- Use Australian spelling (e.g., hypothesise instead of hypothesize)
- Start sentences with words, not numerals, e.g., use "Seventy-five people..." rather than "75 people...".
- In sentences, use words rather than symbols, e.g., "<= 21" should be written as "less than or equal to 21". Within brackets, however, symbols should be used.
- Symbols such as equals (=) represent/replace words; therefore, they should have a space before and after them.
- Statistical symbols which use English letters (such as M) should be italicised.
- Write in the third person perspective (i.e., do not use I, we, you, our, etc.).
Formatting
- Indent the first line of each paragraph.
- Use page breaks rather than multiple blank lines to separate content onto new pages.
Tables
- Unedited (default) output from statistical software (for tables and figures) is not acceptable as APA style.
- Right align statistics in tables.
- Two decimal places are generally sufficient - we don't learn much from the third.
- Centre tables horizontally on the page.
- Using the Table feature of the word processing software is recommended because one cell per unit of information allows more powerful manipulation and formatting of columns and rows.
Capitalisation
- Measured constructs should be referred to as proper names, i.e., with the first letter of each word capitalised. This applies mostly in the Method and Results and parts of the Discussion. In the Introduction and parts of the Discussion, where the more general concept (not the operationalised measure) is being discussed, it should not be capitalised.
Written expression
- Avoid one-sentence paragraphs (try three to five sentences).
- Avoid overly-long paragraphs (convey one key idea per paragraph).