Exploring Reflection in Creative Experiences

Author: Corey John Ford

Reflection is crucial in any creative practice – as you experiment with your materials, each tweak will pose a question. Has this improved my work? Does this meet my artistic intentions? What might others think of this decision? For instance, a composer will continually adjust notes until they decide on their final melodies. An author will ponder over individual sentences. A dancer will refine and select their movements when creating a dance routine. In this blog post, I will describe how reflection can be characterised in creative user experiences, based on my upcoming research paper1 to be presented at ACM CHI 2023 – the top conference in the field of Human-Computer Interaction! CHI brings together researchers and practitioners from various disciplines to explore the latest advances in human-computer interaction, and I’m excited to share my findings with this community. Whether you’re working in the music industry, academia, or any other creative field, this research paper will help you to gain deeper insights into the reflective processes driving innovation and creativity.

In my PhD research, I am exploring the role of reflection in creative experiences, with a particular focus on the use of music software that incorporates generative AI. These types of systems have been in the news lately, with ChatGPT, for example, demonstrating the potential of AI to generate novel and creative output. My goal is to investigate whether the use of such software can support and enhance reflection in music creation, and if so, what types of reflection are most effective. In this way, my research follows a human-centred approach towards designing an AI tool which amplifies, augments and enhances human creativity rather than trying to automate it2.

Studying reflection in people’s use of music software is a challenging task requiring research with human participants and necessitating careful research design. If you ask people the wrong questions, you may get misleading results that do not accurately reflect their true reflective processes. Also, open-ended questions do not provide a quantifiable measure of reflection, making it difficult to compare results across different studies or to make recommendations to research teams or businesses on how to design more reflective systems.

I realised that there was no existing questionnaire that could effectively measure reflection in creative experiences. This gap in the literature inspired me to develop the Reflection in Creative Experience Questionnaire (RiCE). As people use technology to be creative, RiCE helps us to identify reflection in their creative user experiences. With RiCE, I hope to support not only my own research on AI and Music, but also that of other creative technology researchers looking to support reflection in the creative process. 

After consultation with experts on creativity (including AIM PhD students), crowdsourced feedback from 300+ users of creative tools, and statistical analysis of this data, RiCE came to include questions for four types of reflection which might occur in creative practice:

  • Reflection on Current Process. Do people reflect on their creative process? RiCE asks users: i) “Whilst being creative, I liked to think about my actions to find alternative ways of doing them”; and ii) “I often re-examined things I’d already learnt”.
  • Reflection on Self. Do people reflect on themselves and their personal experience? RiCE asks users: i) “I learned many new things about myself during the experience”; and ii) “I pondered over the meaning of what I was doing in relation to my personal experiences.”
  • Reflection on Past Experience. Do people look back on their past experiences? RiCE asks users: i) “I explored my past experiences as a way of understanding new ideas”; and ii) “Whilst creating, I thought back on some of my past experiences.”
  • Reflection through Experimentation. Do people reflect on ideas tested when using technology? RiCE asks users: i) “I made comparisons within the system to consider alternative ways of doing things”; and ii) “I often generated, tested and revised ideas.”

Figure 1: The types of reflection which might occur in creative practice.
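To make the four subscales above concrete, here is a minimal sketch of how responses to RiCE's items might be aggregated per subscale. The 1-5 Likert scale, the item keys, and per-subscale averaging are illustrative assumptions for this post, not the scoring procedure published in the paper.

```python
# Hypothetical scoring sketch for the four RiCE subscales.
# Item keys (e.g. "cp1") and the 1-5 scale are assumptions.
SUBSCALES = {
    "current_process": ["cp1", "cp2"],
    "self": ["self1", "self2"],
    "past_experience": ["past1", "past2"],
    "experimentation": ["exp1", "exp2"],
}

def score_rice(responses):
    """Return the mean rating per subscale for one participant."""
    scores = {}
    for subscale, items in SUBSCALES.items():
        ratings = [responses[item] for item in items]
        scores[subscale] = sum(ratings) / len(ratings)
    return scores

# One participant's (made-up) ratings on a 1-5 scale:
answers = {"cp1": 5, "cp2": 4, "self1": 3, "self2": 2,
           "past1": 4, "past2": 4, "exp1": 5, "exp2": 5}
print(score_rice(answers))
```

Subscale means like these would let researchers compare, say, how strongly two different creative tools prompt reflection through experimentation.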

RiCE might be a valuable tool as it can help to identify reflection in creative user experiences and support the design of reflective tools. This could benefit people interested in studying creative user experiences, designing creative user experiences, and even in designing AI systems which support reflection. If you’re interested in testing RiCE in your own work, please don’t hesitate to contact me. It’d be fantastic to collaborate with other researchers and explore new avenues for studying reflection in creative experiences.

1 = Corey Ford and Nick Bryan-Kinns. 2023. Towards a Reflection in Creative Experience Questionnaire. In Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems (CHI ’23), April 23–28, 2023, Hamburg, Germany. ACM, New York, NY, USA, 16 pages. DOI: 3544548.3581077.

2 = Ben Shneiderman. 2022. Human-Centered AI. Oxford University Press, UK.

AIM at ISMIR 2022

On 4-8 December 2022, several AIM PhD students will participate in the 23rd International Society for Music Information Retrieval Conference (ISMIR 2022). ISMIR is the leading conference in the field of music informatics and this year it will take place in Bengaluru, India, and online.

As in previous years, AIM will have a strong presence at the conference, both in terms of numbers and overall impact.

In the Technical Programme, the following papers are authored/co-authored by AIM students:

On Tutorials, AIM PhD students Christian Steinmetz and Soumya Vanka will be co-presenting a tutorial on “Deep learning for automatic mixing”.

On Special Sessions, AIM PhD student Lele Liu is joining as panellist the special session on PhD in MIR: Challenges and Opportunities.

On the Late-breaking/Demo (LBD) session, the following extended abstracts are authored/co-authored by AIM students:

See you at ISMIR!

DMRN 2022

The AIM 2021 cohort will be organising the Digital Music Research Network (DMRN) 2022 workshop on 20th December 2022. The workshop is hosted at QMUL on an annual basis by the Centre for Digital Music and aims to promote research in Digital Music by bringing together researchers from academia and industry in electronics engineering, computer science, and music.

The workshop will include a keynote talk by Dr. Sander Dieleman on generative modelling and iterative refinement. Submissions of abstracts for talks or posters are open until 20th November.

If you are a possible AIM CDT 2023 applicant, you are welcome to come and talk to us at the event (contact us for registration). For more details, please refer to the relevant website.