Introduction

“The purpose of education is to replace an empty mind with an open one.” This famous quote by Malcolm S. Forbes could not be more apt for today’s ever-changing, rapidly evolving world. Perhaps the ability to adapt and keep up with the times has now become more important than the amount of conceptual knowledge one possesses. Suleman (2017) describes how higher education institutions are under considerable pressure to prepare graduates for the world of work, with the generally agreed expectation that graduates will possess employability skills, including relational, technical and cognitive skills. This suggests that it is the transferable skills one learns, rather than the number of facts one knows or remembers, that make one professionally competent, a sentiment shared by Nagele et al. (2016), who highlight the crucial role of transferable skills in organisational recruitment and selection processes.

Nagele et al. (2016) discuss the difference between surface learning, which focuses on the reproduction of learnt material, and deep learning, which is concerned with the merging of theory and practice and with the actual understanding, rather than the memorisation, of material. Learning is done not only with the head, but also with the heart and hands (Reynolds et al., 2012, as cited in Nagele et al., 2016). Deep learning promotes the transferability of skills, as it encourages critical reflection and inquiry about the material or task being learnt and how it relates to a particular overarching subject or goal (Nagele et al., 2016).

The World Economic Forum has also recently raised its own projection of the importance of creativity for employment (Wilson et al., 2017). However, it is widely recognised that there are often tensions between the current educational system and the development and realisation of learners’ personal creativity (Wilson et al., 2017). By being too fixed on “learning outcomes” and on maintaining an academic tradition, education may at times compromise creativity. This is also seen in the direct tension between the formal, conventional student assessments that tend to be used in universities and creativity, with regulatory processes of assessment promoting risk-averse learning approaches (Wilson et al., 2017) that may hinder rather than allow for creative expression in assessments. Although it is recognised that creativity, and the nonconformity that comes with it, is not suitable for all professions (Wilson et al., 2017), it is certainly valuable in the humanities. Papaleontiou-Louca et al. (2014) discuss ways in which higher education can promote creativity in students, such as fostering curiosity and challenging assumptions. Zawacki-Richter et al. (2020) also discuss the concept of open education, the core of which is an openness that allows for transparency, collaboration and participation by all students of varied abilities and needs, while also allowing traditional subjects to be taught in more varied ways.

The aims of this social policy paper are the following:

  • to advocate for the improvement of the Bachelor of Psychology (Hons) and related courses through a value-focused shift towards increased creativity, critical thinking and flexibility in modes of examination and learning,
  • to provide realistic suggestions and practical changes for what an improved course would look like within the Maltese context that will benefit both students and lecturers, applying this to other courses if applicable, and
  • to use these suggestions to decrease the barrier between the theory of the course and the practice of skills in working with people.

These aims will be fulfilled by obtaining opinions and feedback directly from psychology students, lecturers, and graduates in the field.

Literature Review

As time has progressed, so has students’ thinking. Shavelson et al. (2018) speak about the ‘21st century skills’ needed to enter the workforce; these include critical thinking, problem solving, perspective taking, and communication skills. Nowadays, undergraduates need a more modern approach to learning, one that, for example, combines appropriate technology, networking and small-group activities with traditional teaching.

Assessment is a core component of education. It is a way of communicating between the educational world and the societal world (Shavelson et al., 2018). The ways of thinking, habits, technology, and politics of a particular age shape the assessment practices implemented in schools, colleges, and universities, as well as in workplaces and less formal learning environments. Students may interpret all assessments as summative and devalue and/or resist their involvement in them (Shavelson et al., 2018); students can therefore only succeed if assessments allow them to express themselves fluently. This necessitates the recognition of multimodal expression, which includes a variety of combinations of oral, verbal, and craft modes (Vincent, 2006).

According to Madaus (1986), a significant portion of today’s well-known assessment technology apparatus is the result of the modernist assumptions and educational requirements of the nineteenth century. Isaacs (2010) notes that over the past few decades there has been a commitment to believing in the power of numbers, grades and targets to deliver quality, accountability, equality, and defensibility, making us an “assessment society”. This development can be attributed to globalisation and a work-oriented society.

Broadfoot (1998) states that the underlying assumptions of these approaches can be identified as: (a) that it is right, ‘objectively’, to seek to identify relative levels of student performance as the basis for educational selection; (b) that it is possible to undertake such identification with a sufficient degree of ‘objectivity’ that it provides a broadly fair outcome for the candidates affected; (c) that the quality of such assessment is embodied in notions of reliability and validity; (d) that students’ scores on national examinations and tests provide a valid indicator of the quality of institutional performance; and (e) that it is possible usefully to compare the ‘productivity’ of individual education systems through international comparisons.

A comprehensive cost-benefit analysis of current assessment practices is likely to reveal a significant mismatch between the principles on which they are largely based and the ability of available techniques to meet those principles adequately. Such an analysis could enunciate a distinct set of assessment principles based on educational rather than measurement priorities, which could in turn call into question the legitimacy of many of the assessment methods currently in use. In particular, Broadfoot and Black (2004) suggest the following questions that need urgent attention: How many of the current methods of assessment reinforce outdated notions of curriculum content and student learning at the expense of modern learning skills and dispositions such as creativity and learning to learn? Is it time for a new assessment paradigm based on the very different needs and epistemologies of the twenty-first century?

Criteria for assessments should be clear about the desired learning outcomes so that students are able to make use of various forms to meet them. If we take a traditional assessment format such as essay writing, its main purpose is for students to show that they can construct a well-structured critical argument. The same competencies can also be demonstrated through a different form of assessment, such as a video presentation. According to Hall (1982), these alternative formats can prove more suitable and less challenging for students to show what they have learnt. This could be particularly helpful for students with disabilities (Konur, 2006), who might otherwise struggle with traditional modes of assessment.

Sambell et al. (1997) found that flexible assessment styles might help students prepare for life after university, because they can choose an assessment format that has transferability for their careers. This suggests that an element of choice in assessments could potentially increase motivation and learning. A choice between an essay and an online test, however, is unlikely to give every student the chance to demonstrate their learning in their preferred way (Hall, 1982). Instead, the author found that student preference varies widely depending on their individual learning strategies and their creativity in demonstrating their ability in the subject.

Some research indicates that the assessments which form the baseline for most universities, such as essays and exams, often gauge students’ ability to respond to a specific type of test and to adapt to its limitations, such as allotted time, rather than their overall learning. For instance, because a large volume of reading is required of students, traditional evaluation formats like essay writing might be challenging for some students, especially those with disabilities (Hanafin et al., 2007). Hanafin et al. (2007) discovered that even when students were given more time as part of access arrangements, the volume of text can prevent some students from adequately demonstrating their learning. They also concluded that unseen exams can disadvantage students with disabilities because they have to deal with text in a stressful, time-limited situation. Sambell et al. (1997) add to this the fact that students believe unseen tests examine their memory, stress-management abilities, and luck more than their ability to display what they have learned.

According to educators engaged in evaluation practices, assessment has a significant impact on students’ learning. The approach that a student adopts towards assignments and evaluation tasks is influenced by their perception of learning and studying (Struyven et al., 2005). Additionally, the manner in which a student experiences evaluation and assessment can shape their approach to future learning, making assessment not only logically but also empirically a crucial characteristic that defines a student’s approach to learning. The following approaches illustrate the connections between the perceived properties of assessment and the approaches students take towards learning and studying.

Struyven et al. (2005) identify three primary learning approaches concerning students’ learning perceptions. Firstly, surface approaches to learning refer to a desire to finish the learning assignment with minimal involvement, perceiving it as an unwanted external obligation. These are often linked to repetitive and unthoughtful memorisation and technical problem-solving, leading to limited conceptual comprehension as an expected result. Contrastingly, deep approaches to learning start with an intention to comprehend and involve active conceptual analysis; they often lead to a deep level of understanding and, if pursued rigorously, are typically linked to high-quality learning outcomes. Lastly, there is the strategic or achieving approach, a category introduced because of the widespread evidence indicating the impact of assessment on learning and studying. In this scenario, the student aims to obtain the best possible grades through diligent study methods, efficient time management, and a well-structured approach.

These approaches should be viewed as dynamic and continuously adapted according to the context the learner is facing, rather than interpreted as fixed concepts (Struyven et al., 2005). Nonetheless, the modifications in such approaches are frequently subtle and go unnoticed.

Based on the reviewed studies, there is a strong connection between students’ perceptions of assessment and their approaches to learning. The perceived attributes of assessment appear to significantly influence students’ approaches, and conversely, students’ approaches can also affect their perceptions of assessment. Such perceptions can have either positive or negative influences, and assessments perceived as inappropriate are more likely to promote surface approaches to learning. This implies that inducing a surface approach to learning is effortless, whereas fostering a deep approach appears to be more challenging. Educators have a significant impact on students’ learning approaches (Struyven et al., 2005), but are not effectively providing them with appropriate guidance for optimal learning. Additional research is necessary to determine the underlying causes of this issue.

Methodology

For the survey given to students, the data were gathered through a Google Form which included a total of 25 questions (see Appendix A). The first part covered demographics, where respondents were asked their age, gender, year of beginning the course, and which course they are or were most recently following: Bachelor of Psychology (Hons), Bachelor of Arts in Psychology or Higher Diploma in Psychology. The criterion for graduates was to have graduated from the aforementioned courses within the last five years. This was followed by questions about teaching and lecture styles, methods of assessment, preparation for the workplace and personal development, change at the university, and experience with studying abroad. The form was shared along with a poster (see Appendix B) on Betapsi’s Facebook and Instagram pages, where people could opt in and consent to fill it in. Another poster (see Appendix C) was printed and hung up around campus with a QR code for students to scan and fill in the survey. Additionally, emails were sent to Betapsi members and through the University of Malta Registrar (see Appendix D). Descriptive statistics were used to analyse the data.
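To illustrate the kind of descriptive analysis referred to above, the following is a minimal sketch in Python, assuming the Google Form responses have been exported to a CSV file. The file name and column labels used here (age, course, preferred_assessment) are hypothetical placeholders rather than the actual survey fields.

```python
# Minimal sketch of the descriptive analysis, assuming the Google Form
# responses are exported to a CSV file. The file name and column labels
# below are hypothetical placeholders, not the actual survey fields.
import pandas as pd

responses = pd.read_csv("student_survey_responses.csv")

# Central tendency for age (the paper reports the median age)
median_age = responses["age"].median()

# Percentage breakdown of a single-choice item, e.g. course enrolled in
course_pct = (responses["course"]
              .value_counts(normalize=True)
              .mul(100)
              .round(1))

# Multi-select items (e.g. preferred assessment methods) are typically
# stored as comma-separated strings in a Form export; split before counting
methods = responses["preferred_assessment"].str.split(", ").explode()
method_pct = (methods.value_counts()
              .div(len(responses))
              .mul(100)
              .round(1))

print(f"Median age: {median_age}")
print(course_pct)
print(method_pct)
```

Percentages for multi-select items are computed against the total number of respondents rather than the number of selections, which is why they can sum to more than 100%, as in the results reported below.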

For the survey given to lecturing staff, the data were gathered through a Google Form which included a total of 6 questions asking participants for their opinions on assessment methods, quality control measures and changes they would like to see within the university (see Appendix E). This survey was shared through the Head of Department (Dr. Gottfried Catania), who kindly forwarded it to the lecturing staff in the Department of Psychology (see Appendix F), and through individual emails sent to full-time and part-time lecturers within the department. The data were analysed by drawing out common themes and comparing them to students’ responses.

Results and Discussion

Demographics

A total of 96 student responses were recorded, of which 76 (79.2%) are or were enrolled in the Bachelor of Psychology (Hons.), 14 (14.6%) in the Bachelor of Arts in Psychology with another area, and 6 (6.3%) in the Higher Diploma in Psychology. Of these respondents, the majority (81.1%) enrolled between 2019 and 2021, meaning they are currently students or have graduated within the last year; the remainder enrolled between 2014 and 2018. The median age was 21, and the majority of respondents were female (80.2%).

A total of 13 responses from the lecturing staff were recorded.

Lecturing Style

The majority of respondents stated that the lecture style through which they learn most is a lecture with student engagement (62.1%), followed by tutorials (47.4%) and practical methods (45.3%). When asked whether they think the method of teaching is inclusive of different needs, 48.9% of students said it is not and 23.2% were unsure.

Methods of Assessment

Versatility of Assessment Methods and Accordance to Study Unit Learning Objectives

With regard to methods of assessment, 53.7% of students said that assessments do not accurately reflect the study-unit learning objectives, 30.5% said they do, while the remaining 15.8% were unsure. The lecturing staff brought up a number of measures taken to ensure consistency among different units. These include entities such as the Academic Programmes Quality & Resources Unit (APQRU), which reviews the planned modes of assessment of study units, as well as a Programme Validation Committee. One respondent also mentioned the university’s Quality Assurance unit, which is currently trying to implement this consistency. However, the respondent also highlighted that this is a work in progress and that not all lecturers check that study-unit contents and assessments address the published description. At times, external examiners are also present, which requires local examiners to hand in a report. Nonetheless, another respondent highlighted the importance of the lecturing staff being able to maintain flexibility and to have their expertise and teaching style respected.

In addition, there are also set guidelines and documents that lecturers need to follow. The responses also highlight that the department regularly implements quality control measures and holds sessions with the Board of Examiners. Moreover, one response stated that the department follows up on student feedback each semester, though the number of students who complete the feedback forms is quite low, leaving the department with limited feedback to consider. From the students’ side, the majority of respondents (56.8%) said that they do not feel they can write their opinion in an exam question without fear of negative repercussions, whilst 22.1% were unsure and 21.1% said they could. This may be tied to the low student response to the feedback forms. It is also in line with Wilson et al. (2017), who argue that by focusing too much on learning objectives, creativity is compromised, which is seen as problematic for students. The largest proportion of students (45.3%) also said that they had never received feedback on assessments, and 24.2% said they had received it only once.

Alternative Assessment Methods

According to Hall (1982), alternative assessment methods may be more suitable methods for students to demonstrate their learning. They can also be more accessible to students with disabilities, who may otherwise struggle with traditional assessment methods (Konur, 2006). Alternative forms of assessment can help foster deep learning (Nagele et al., 2016), which is not merely reproducing what one learns.

The lecturing staff unanimously agreed on having alternative and non-traditional assessment methods, though they listed a number of provisos. Many of them pointed out that these should be utilised only as long as they reflect the study-unit outcomes. In addition, the use of these methods is considered to be highly dependent on the number of students in the study unit. When lecturers have a large number of students, particularly in the case of compulsory study units, they reported feeling obliged to resort to traditional methods, especially exams based on multiple-choice questions (MCQs), because the large amount of time required would make other methods very challenging to execute.

Although the lecturing staff showed an openness to alternative assessment methods, they did not believe that more traditional methods should be discarded. Indeed, one of the responses highlighted that the typical assignment is an important assessment method which allows students to study and research a topic in detail. Assignments can also serve as an opportunity for students to practise constructing critical arguments. This seems to be corroborated by the student responses, with 85.3% of students listing assignments as one of their preferred methods of assessment, followed by multiple-choice questions (69.5%) and reflective journals (51.6%). As such, it can be seen that students would like creative expression in assessments, which is valuable in all streams, especially the humanities (Wilson et al., 2017).

Element of Choice

The majority of students (92.6%) said that they would prefer having an element of choice in the way they are assessed. When having an element of choice, students may also be better prepared for life after university because they are able to choose an assessment format that has transferability for their future careers (Sambell et al., 1997).

Meanwhile, the lecturing staff’s responses highlight that while students should have the right of choice, this should be limited to the topic or title of the assessment, essay, or activity, rather than the assessment method. The reason for this is that lecturers believe students’ views are important, but it is also important to maintain study-unit coherence and respect the academic’s expertise. The nature of the material that is taught and assessed in the study unit must also be considered: whereas assessments such as examinations based on multiple-choice questions test breadth, other methods, such as assignments and essays, test depth. Hence, some lecturers suggested that certain assessment methods cannot always be implemented if they do not match the requirements of the study unit. Other responses highlighted that while this would be desirable, it would be difficult to achieve due to the large student numbers. Another issue is that there would need to be a sense of coherence in assessing students in the same way.

When lecturers were asked whether they were allowed to implement the assessment method of their choice, most of the respondents stated that they felt able to do so, but only up to a certain point, or only for certain study units. First and foremost, the lecturers emphasised that the assessment method needs to reflect the outcomes of the study unit and cannot deviate from the study-unit description. Furthermore, they mentioned that the assessment method has to be chosen from a list of possible assessment methods and that any changes need to be planned two years ahead. Beyond formal regulations, the respondents highlighted that the large student body, the lack of resources and inadequate remuneration act as more practical limitations on implementing alternative assessment methods. The most common of these restrictions was the large number of students, mentioned by 38.5% of the lecturers. Lastly, the majority of students (73.7%) said that they would prefer having their assessment percentage split up across the semester. Understandably, the large number of students each year and the lack of teaching resources may hinder this from happening.

What is tested

Moreover, when asked what they think is tested during time-limited unseen exams, students replied that memory is tested the most (94.7%), followed by stress management (63.2%), time management (61.1%), luck (54.7%), understanding (36.8%), and lastly effort (18.9%). This is in line with Sambell et al. (1997), who found that students believe unseen tests examine their memory, stress-management abilities, and luck more than their ability to display what they have learned. The element of luck that students feel plays a role in testing can be interpreted as their ability to respond to a specific type of test, and the questions it entails, rather than showing their learning, whilst also having to manage their stress.

The lecturing staff all believed that memory was an important factor in time-limited unseen exams (100%), closely followed by time management and understanding (92.3%). The next factor was effort (76.9%). Fewer respondents believed that stress management (53.8%) and luck (46.2%) were tested during time-limited, unseen exams.

The differences in perception of what is tested during time-limited unseen exams may hint at why this assessment method is a popular choice by assessors, even though it is not favoured by students.


Preparation for the Workplace and Personal Development

When it comes to preparation for the workplace and personal development, 49.5% of respondents said that they do not think they have gained practical skills that they could make use of in the workplace. However, 52.6% said that they have learnt skills that are transferable to other areas of study after the undergraduate degree. Most respondents (68.4%) said that they have gained research skills from the psychology course, followed by empathy (57.9%) and critical thinking (56.8%). Indeed, Shavelson et al. (2018) emphasise how, in today’s age, colleges are pressured to cater for the 21st century skills required for the workforce, most importantly critical thinking.

When it comes to group work, one lecturer remarked that they are concerned by students who prefer to work on their own during the study unit since group work is an important part of developing the collaborative skills necessary in the working world. On the other hand, one lecturer highlighted that group assignments can lead to social loafing, which can be frustrating for students.

When asked what practical skills they would like to learn that are not emphasised in the psychology course, most students mentioned that they would like to learn problem-solving skills, which do not entail learning by heart. Moreover, there was also an emphasis on learning skills that reflect the working experience of a psychologist. It is interesting to note that some skills the respondents thought they had gained in their course were also mentioned in this section, which may highlight a lack of consistency in the aims of certain aspects of the course. Suleman (2017) describes how higher education institutions are under pressure to prepare graduates for the workforce, and such skills should be transferable (Nagele et al., 2016). It is therefore important to highlight that these students do not feel they are learning practical skills that will prepare them for employment, which can hinder their professional development.


Change at University

When discussing change at the university, 72.6% of respondents do not think that they have a voice at the University. There was a variety of responses to the question of what is liked in the course. These included an appreciation for those lecturers who truly dedicate their time to their study units, the vastness of the course, where there is something to interest everyone, and the fact that students can apply what they learnt to their life experience. When asked what would make their experience better, many students expressed a wish for a number of placement and practical options within the course. Additionally, they mentioned that the format of study units should be improved, for example by providing notes and readings before the lecture and by making what is being learnt more relevant to the profession. Students also mentioned having more interaction with the lecturing staff, and a better understanding of students’ individual needs, namely concerning access arrangements. This latter point is corroborated by one of the lecturer respondents, who argued for more flexibility in assessing students, especially those with learning disabilities.

As highlighted in the previous responses, the lecturing staff believed that the large student numbers exert a significant amount of pressure upon them. They proposed numerous solutions to relieve some of the workload. The most common one was having more human resources, with 61.5% of respondents bringing it up. More specifically, the lecturing staff who participated in the questionnaire mentioned having more lecturers, especially ones who work on a full-time basis. Two respondents also mentioned the possibility of having teaching assistants, who could be PhD students assisting in tutorials and the marking of assignments; this would also give psychology graduates interested in academia more exposure. Other respondents highlighted having more assistance in examinations, specifically with corrections and with the typing out of examination papers on Wiseflow by administrative staff. Assistance with administrative issues, such as replying to student requests and emails, was also mentioned.

Another theme that arose was a greater respect for academics. This included more respect towards their expertise, as well as towards visiting lecturers, with one respondent stating that they do not feel acknowledged by the university authorities, even in terms of having resources such as office space and planning time. Another concern mentioned was the integration of technology at the university, which included being aware of issues such as ChatGPT and circumventing them through means such as oral exams and interactive fieldwork.

Experience with Studying Abroad

The final section was about experience with studying abroad, which 17.9% of respondents reported having: 10 respondents studied in the UK, 3 in Italy, 2 in Germany, and 1 each in the Czech Republic and the Philippines. All of these respondents said that they observed a difference between the lecturer-student relationship abroad and locally. When asked about the main differences between the system abroad and in Malta, a range of responses was given. Most students mentioned that they felt they had a better relationship with lecturers abroad, which included more interaction in class, receiving feedback after assessments, and having lecture material well organised online. Furthermore, it was mentioned that even where the cohort was also large, some universities divided it into groups to achieve better engagement.

Conclusion

The findings of this research show that the lecturers within the University of Malta psychology department who completed the survey are open to the idea of using non-traditional assessment methods to assess students, while giving valid arguments for maintaining the use of traditional assessment methods, ranging from practical limitations to an appreciation of the value that traditional methods such as assignments still hold when it comes to assessing students. This openness shows that lecturers are themselves willing to be creative as academics but, as seen in their responses, are sometimes limited by the way study units are structured and by their specific outcomes. Moreover, they are limited by the need to maintain a certain structure and coherence, especially when it comes to study units with a high number of students.

Students who participated in the survey showed a range of responses; however, the vast majority showed a preference for having a degree of choice in assessment. Also of interest are the student responses that contradicted lecturers’ opinions, such as on what is tested during unseen exams. Most notably, almost three-quarters of respondents felt that they do not have a voice for change at the university; this should be investigated further to find out what is leading to this perception and what can be done to change it.

References

Broadfoot, P. (1998). Quality standards and control in higher education: What price life-long learning? International Studies in Sociology of Education. https://doi.org/10.1080/0962021980020022

Broadfoot, P., & Black, P. (2004). Redefining assessment? The first ten years of assessment in education. Assessment in Education: Principles, Policy & Practice, 11(1), 7–26. https://doi.org/10.1080/0969594042000208976

Hall, C. (1982). Giving more choice to students in economic education: Results and evaluation. Journal of Economic Education. https://doi.org/10.1080/00220485.1982.10844983

Hanafin, J., Shevlin, M., Kenny, M., & Neela, E. M. (2007). Including young people with disabilities: Assessment challenges in higher education. Higher Education, 54(3), 435–448. https://doi.org/10.1007/s10734-006-9005-9

Isaacs, T. (2010). Educational assessment in England. Assessment in Education: Principles, Policy & Practice, 17(3), 315–334. https://doi.org/10.1080/0969594x.2010.491787

Konur, O. (2006). Teaching disabled students in higher education. Teaching in Higher Education, 11(3), 351–363. https://doi.org/10.1080/13562510600680871

Madaus, G. F. (1986). Measurement specialists: Testing the faith. A reply to Mehrens. Educational Measurement: Issues and Practice, 5(4), 11–14. https://doi.org/10.1111/j.1745-3992.1986.tb00492.x

McClelland, D. C. (1973). Testing for competence rather than for “intelligence.” American Psychologist, 28(1), 1–14. https://doi.org/10.1037/h0034092

Nägele, C., & Stalder, B. E. (2017). Competence and the need for transferable skills. Competence-based vocational and professional education: Bridging the worlds of work and education, 23, 739–753. https://doi.org/10.1007/978-3-319-41713-4_34

Papaleontiou-Louca, E., Varnava-Marouchou, D., Mihai, S., & Konis, E. (2014). Teaching for creativity in universities. Journal of Education and Human Development, 3(4), 131–154. https://doi.org/10.15640/jehd.v3n4a13

Sambell, K., McDowell, L., & Brown, S. (1997). “But is it fair?”: An exploratory study of student perceptions of the consequential validity of assessment. Studies in Educational Evaluation, 23(4), 349–371. https://doi.org/10.1016/s0191-491x(97)86215-3

Shavelson, R. J., Beckum, L., & Brown, B. (1974). A criterion-sampling approach to selecting patrolmen. Police Chief, 41(9), 55–61

Shavelson, R. J., Zlatkin-Troitschanskaia, O., & Mariño, J. P. (2018). International Performance Assessment of Learning in Higher Education (iPAL): Research and Development. Springer EBooks, 193–214. https://doi.org/10.1007/978-3-319-74338-7_10

Struyven, K., Dochy, F., & Janssens, S. (2005). Students’ perceptions about evaluation and assessment in higher education: A review. Assessment & Evaluation in Higher Education, 30(4), 325–341. https://doi.org/10.1080/02602930500099102

Suleman, F. (2018). The employability skills of higher education graduates: insights into conceptual frameworks and methodological options. Higher Education, 76, 263-278. https://doi.org/10.1007/s10734-017-0207-0

Vincent, J. B. (2006). Children writing: Multimodality and assessment in the writing classroom. Literacy, 40(1), 51–57. https://doi.org/10.1111/j.1467-9345.2006.00426.x

Wilson, C., Lennox, P. P., Hughes, G., & Brown, M. (2017). How to develop creative capacity for the fourth industrial revolution: Creativity and employability in higher education. In F. Reisman (Ed.), Creativity, Innovation and Wellbeing (pp. 241–274). London: KIE Conference Publications. https://core.ac.uk/download/pdf/130982911.pdf

Zawacki-Richter, O., Conrad, D., Bozkurt, A., Aydin, C. H., Bedenlier, S., Jung, I., Stoter, J., Veletsianos, G., Blaschke, L., Bond, M., Broens, A., Bruhn, E., Dolch, C., Kalz, M., Kondakci, Y., Marin, V., Mayrberger, K., Muskens, W., Naidu, S., … & Kerres, M. (2020). Elements of open education: An invitation to future research. International Review of Research in Open and Distributed Learning, 21(3), 319–334. https://doi.org/10.19173/irrodl.v21i3.4659