Informal learning from dealing with software-related problems in the digital workplace

Tamara Vanessa Leiß (Mannheim Business School (MBS), University of Mannheim, Mannheim, Germany)
Andreas Rausch (Mannheim Business School (MBS), University of Mannheim, Mannheim, Germany)

Journal of Workplace Learning

ISSN: 1366-5626

Article publication date: 27 September 2023

Issue publication date: 18 December 2023


Abstract

Purpose

This paper aims to examine the impact of problem-solving activities, emotional experiences and contextual and personal factors on learning from dealing with software-related problems in everyday office work.

Design/methodology/approach

To measure the use of problem-solving activities, emotional experiences and the contextual factors of problem characteristics and learning in situ, a research diary was used. To measure team psychological safety (contextual factor) and personal factors, including the Big Five personality traits, occupational self-efficacy and technology self-efficacy, the authors administered a self-report questionnaire. In sum, 48 students from a software company in Germany recorded 240 diary entries during five working days. The data was analysed using multilevel analysis.

Findings

Results revealed that asking others and using information from the internet are positive predictors of self-perceived learning from a software-related problem, while experimenting, which was the most common activity, had a negative effect on learning. Guilt about the problem was positively related to learning, while working in the office (as opposed to remote work) and feeling irritated/annoyed/angry showed negative effects. Surprisingly, psychological safety had a negative effect on perceived learning.

Research limitations/implications

Major limitations of the study concern the convenience sample and the disregard for the sequence of the activities.

Originality/value

This study contributes to the limited empirical evidence on employees’ problem-solving activities and informal workplace learning in the software context. To overcome the shortcomings of previous studies using retrospective assessments and in-lab observations, this study uses the diary method to investigate in situ.

Citation

Leiß, T.V. and Rausch, A. (2023), "Informal learning from dealing with software-related problems in the digital workplace", Journal of Workplace Learning, Vol. 35 No. 9, pp. 291-310. https://doi.org/10.1108/JWL-03-2023-0042

Publisher: Emerald Publishing Limited

Copyright © 2023, Tamara Vanessa Leiß and Andreas Rausch.

License

Published by Emerald Publishing Limited. This article is published under the Creative Commons Attribution (CC BY 4.0) licence. Anyone may reproduce, distribute, translate and create derivative works of this article (for both commercial & non-commercial purposes), subject to full attribution to the original publication and authors. The full terms of this licence may be seen at http://creativecommons.org/licences/by/4.0/legalcode


Introduction

In the present study, we investigate the antecedents of informal workplace learning from dealing with software-related problems in the workplace. For several reasons, it is assumed that informal learning will become increasingly important compared to formal learning (Littlejohn and Pammer-Schindler, 2022). First, software has become the most important tool for knowledge workers (Leiß et al., 2022; Littlejohn and Margaryan, 2014b, 2014a) and is subject to frequent updates and changes (Kiani et al., 2020). Oftentimes, users have to troubleshoot and learn software by themselves in the course of their regular working tasks (Kiani et al., 2020), as it is challenging to cover these rapid developments with formal training (Kiani et al., 2020; Harteis, 2022). Second, while standard curriculums have been successful in training employees in standard work practices, knowledge workers increasingly have to deal with “fuzzy tasks” (Harteis, 2022, p. 417), as well as complex and niche problems (Littlejohn and Pammer-Schindler, 2022). Thus, problem-solving is a central requirement for knowledge workers, especially when dealing with complex software tools, but, at the same time, problem-solving is also considered an important source of informal learning in the workplace (Rausch, 2013; Kiani et al., 2020; Tynjälä, 2013; Tynjälä and Häkkinen, 2005).

To solve work-related problems, knowledge workers usually have access to various resources (Rausch et al., 2015; Cuyvers et al., 2016; Kiani et al., 2019; Kiani et al., 2020). These can comprise personal, social and technological resources (Leiß et al., 2022). Although there is some empirical work on employees’ use of resources and problem-solving activities and learning (Rausch et al., 2015; Cuyvers et al., 2016; Haemer et al., 2017; Kooken et al., 2007), with some focusing specifically on the software context (Andrade et al., 2009; Leiß et al., 2022; Kiani et al., 2020; Kiani et al., 2019; Novick et al., 2009), empirical evidence is scarce. Furthermore, previous research has mostly relied on interviews, in-lab observations and, occasionally, surveys. While interviews and surveys suffer from the disadvantage of being based on retrospective assessments, in-lab observations take place outside of real work activities, affecting their validity. As a research method, the diary reduces retrospective memory bias and enables investigation in situ (Rausch et al., 2022; Bolger et al., 2003; Ohly et al., 2010). Thus, it is particularly appropriate for researching the handling of software (Benbasat and Barki, 2007; Littlejohn and Margaryan, 2014b). To the best of the authors’ knowledge, no empirical study has examined software-related problem-solving and the resulting learning by applying intensive longitudinal methods such as diaries. To fill this research gap, the present study investigates employees’ use of resources in solving software-related problems and the resulting learning by using a research diary.

Learning and problem-solving are complex phenomena that are influenced by various antecedents (Cerasoli et al., 2018; Jeong et al., 2018; Littlejohn and Pammer-Schindler, 2022; Noe et al., 2014; Tynjälä, 2008; Vu et al., 2022; Rintala et al., 2019). Thus, our study takes into account emotional experiences as well as the contextual factors of problem characteristics, team psychological safety and the location of work, in addition to the personal factors of the Big Five personality traits and self-efficacy, as potential antecedents of learning from software-related problem-solving.

Informal learning through problem-solving

A large proportion of workplace learning takes place informally (Eraut, 2010; Tynjälä, 2013). Thus, it occurs apart from formally organised learning programs (Eraut, 2000; Marsick and Watkins, 2015), in the absence of a teacher and is oftentimes unstructured, unintended and implicit (Eraut, 2004; Marsick and Watkins, 2015). A central source of informal workplace learning is solving work-related problems (Eraut, 2000, 2004; Tynjälä, 2013; Tynjälä and Häkkinen, 2005). A problem can be defined as a situation in which an individual lacks knowledge on how to achieve a specific goal (Newell and Simon, 1972). Problem-solving usually comprises researching, acquiring and applying new knowledge and may therefore result in learning (Dörner and Wearing, 1995; Newell and Simon, 1972; Wüstenberg et al., 2012). Help-seeking behaviour is conceptually closely related to problem-solving, as help-seeking is always associated with a specific problem that needs to be solved and may result in learning (Lee, 1997; van der Rijt et al., 2013). Characteristics of help-seeking include the involvement of more than one person and a certain proactivity within help-seeking (Lee, 1997; van der Rijt et al., 2013). Other studies explicitly further included non-personal interactions such as forums, text tutorials and videos as well as other resources on the web (Leiß et al., 2022; Kiani et al., 2020; Kiani et al., 2019).

In general, there are several information resources employees can refer to when they face work-related problems in the workplace. In the model of informal workplace learning through problem-solving, Leiß et al. (2022) classify them into personal resources, social resources and technological resources. These resources, in turn, offer several problem-solving activities that employees can use to solve problems at hand. Activities based on personal resources comprise reflecting and trying out. Activities based on social resources include observing competent others as well as asking competent others. Activities based on technological resources refer to consulting codified information, including physical and digital information and tools. Similar activities were reported by Cuyvers et al. (2016) and Haemer et al. (2017). Furthermore, the model highlights the role of personal and contextual factors as well as emotional experiences in problem-solving and learning.

Emotional experiences and workplace learning are strongly connected (Benozzo and Colley, 2012), and there is empirical evidence that emotional experiences impact workplace learning (Rausch et al., 2017; Benozzo and Colley, 2012; Hökkä et al., 2020; Zhao, 2011). Furthermore, emotional experiences affect information-seeking in general, which may also influence learning. Different emotions influence the sources, the start, a potential limitation, the termination and the avoidance of information-seeking in different ways (Savolainen, 2014; Willson and Given, 2020). Feeling stressed, for instance, leads early-career academics to ask colleagues instead of using codified information (Willson and Given, 2020). In addition, Zhang and Jansen (2009) found that happy people processed more general information, while sad people processed more specific information during an internet search. Within the context of problem-solving, Spering et al. (2005) found participants with induced negative emotions to be more thorough when searching for information during their problem-solving attempts and to be more likely to search for information before they started their problem-solving attempts. These findings show that emotional experiences influence not only whether and where people seek information but also how they use and process it.

Empirical research on solving software-related problems

There are a few empirical studies that investigated help resources and problem-solving activities and their influence on problem-solving in the context of workplace learning. In this vein, intrinsic and extrinsic reflection (Haemer et al., 2017), seeking help from others (Haemer et al., 2017; Kooken et al., 2007), interactions with others (Cuyvers et al., 2016), trial-and-error (Haemer et al., 2017; Kooken et al., 2007; Cuyvers et al., 2016), observing (Cuyvers et al., 2016) and consulting (online) written material (Kooken et al., 2007; Cuyvers et al., 2016) were identified to support workplace learning.

Within the software context, in a previous study, Leiß et al. (2022) revealed consulting and observing colleagues as well as reflecting to be most often available and most often used to tackle enterprise resource planning (ERP) software-related problems. However, this was without clear reference to potential learning. Further studies found recalling (Andrade et al., 2009) and asking colleagues (Kiani et al., 2020; Novick and Ward, 2006) to enhance software use and learning. Regarding different online resources like videos or forums (Kiani et al., 2020; Kiani et al., 2019; Novick and Ward, 2006), built-in help (Novick et al., 2009; Andrade et al., 2009; Kiani et al., 2019) and trial-and-error (Novick et al., 2009; Novick and Ward, 2006; Andrade et al., 2009), empirical evidence on the frequency of use and usefulness for software use and learning is ambiguous. These few studies share several limitations: most only inferred learning from task performance, were not field studies and were not conducted in situ.

Antecedents of learning from problem-solving

We distinguish contextual and personal factors as antecedents of learning from problem-solving (Cerasoli et al., 2018; Rintala et al., 2019; Tynjälä, 2013; Vu et al., 2022). As contextual factors, we assume that the location of work, team psychological safety and problem characteristics affect the use of problem-solving activities as well as the resulting learning.

Physical proximity can increase the likelihood that employees learn from each other (Škerlavaj and Dimovski, 2006), while geographical, temporal and perceived separation can negatively impact team communication and the synchronous availability of team members (Morrison-Smith and Ruiz, 2020). Thus, we suppose that the greater a person’s separation from his or her team, the more difficult it may be to ask other team members for help; other learning resources are used instead. This may impact learning outcomes.

Team psychological safety is “a shared belief that the team is safe for interpersonal risk taking” (Edmondson, 1999, p. 354), resulting, for example, in people daring to talk about problems and mistakes or ask for help without fear of losing face (Edmondson, 1999). Psychological safety is a positive antecedent of individual and team workplace learning (Edmondson and Lei, 2014; Frazier et al., 2017; Newman et al., 2017) and affects, for example, learning from failures (Carmeli and Gittell, 2009), proactive learning behaviours (Mornata and Cassar, 2018), cooperative learning (Post, 2012) and different forms of team learning behaviour (Edmondson, 1999; Harvey et al., 2019). Hence, in teams with high team psychological safety, members may be more likely to dare to reveal their lack of knowledge and to ask other members for help, for instance. This assumption is in line with the results of van der Rijt et al. (2013), who found that trust, which is a concept that overlaps with psychological safety (Newman et al., 2017; Edmondson and Lei, 2014), is a significant positive predictor of asking for help (van der Rijt et al., 2013).

Generally, work-related problems can be shaped by different characteristics, such as their structuredness (well-structured vs. ill-structured) (Jonassen, 1997), complexity (Smith, 1991), familiarity (Smith, 1991), difficulty (Smith, 1991), urgency (Rausch et al., 2015), severity (Rausch et al., 2015; Feng and MacGeorge, 2006) or responsibility (Rausch et al., 2015; Feng and MacGeorge, 2006). Three problem characteristics that we expect to be particularly relevant for the present study are urgency, potential negative consequences and the extent to which a person feels guilty for the occurrence of the problem. The urgency of a problem, as well as its potentially negative consequences, can motivate people to solve it quickly and efficiently, which is likely to result in learning. Perceived responsibility can lead to employees not asking others for help so as not to reveal the problem to them, which can also impact learning.

Concerning personal factors, we expect the Big Five personality traits as well as self-efficacy to impact solving software-related problems and learning in the workplace. The Big Five personality traits refer to the five broad personality dimensions of extraversion, neuroticism, agreeableness, openness to experience and conscientiousness (McCrae and Costa, 1999). They influence informal learning in the workplace (Cerasoli et al., 2018; Noe et al., 2013; Rintala et al., 2019), as well as aspects related to technology acceptance and use (Barnett et al., 2015; Devaraj et al., 2008; Özbek et al., 2014). As, for example, people high in extraversion are sociable, talkative and friendly (McCrae and Costa, 1987), we expect them to be more likely to ask other people than use other (software-based) problem-solving activities when they face a software-related problem. This could also ultimately influence learning outcomes.

Furthermore, the Big Five personality traits influence an individual’s information-seeking behaviour in general, which could also impact learning. Al-Samarraie et al. (2017) investigated the influence of personality traits on information-seeking behaviour. Results showed that when searching for a specific piece of information on the internet, people with high conscientiousness are quicker to retrieve information and decide than people with high agreeableness and extraversion. When searching for information on the internet, which requires evaluation in terms of quantity and quality, extraverts were the quickest to find it, followed by people high in agreeableness and people high in conscientiousness. When it comes to using facets and refining queries in an internet-based search process, that is, conducting complex information research, extraverts and people high in conscientiousness performed the best because of their information-processing strategies. In addition, Heinström (2005) found three different information-seeking patterns connected to the Big Five personality traits. The results showed that fast surfing, an information-seeking behaviour characterised by effortless information seeking, reliance on information confirming old views, problems with critically analysing detected information and a lack of time, was positively predicted by conservativeness (low openness to experience). Extraversion, openness to experience and low agreeableness were found to be positive predictors of broad scanning, which is a behaviour comprising thorough and wide information seeking. The information-seeking behaviour of deep diving involves greater effort to find only information of the highest quality. This behaviour was not significantly affected by any of the Big Five personality traits (Heinström, 2005).

Self-efficacy is a further predictor of informal workplace learning (Cerasoli et al., 2018; Choi and Jacobs, 2011; Jeong et al., 2018; Rintala et al., 2019) and can be defined as “people’s judgement of their capabilities to organise and execute courses of action required to attain designated types of performances” (Bandura, 1986, p. 391). Self-efficacy can also be transferred to computer and IT use. In some studies, computer self-efficacy was identified as a positive predictor of technology acceptance and use (Lee et al., 2003; Venkatesh, 2000; Venkatesh and Bala, 2008). Self-efficacy may also impact the use of activities for solving software-related problems. For instance, it may have an influence on whether a user trusts himself or herself to independently solve such a problem. High self-efficacy might then be related to a greater likelihood of using problem-solving activities that do not involve other people, such as trying out or searching on the internet. Supporting this assumption, Cleavenger et al. (2007) also expect people with high self-efficacy to be less likely to ask other people about a problem at hand because of their belief in their own ability to handle the problem. Although Cleavenger et al. (2007) found no empirical evidence for this relationship, we nevertheless consider it plausible.

Altogether, given prior research’s results and shortcomings discussed above, our study addresses the following research questions: How do (RQ1) problem-solving activities; (RQ2) emotional experiences; (RQ3) contextual factors; and (RQ4) personal factors influence learning from solving software-related problems?

Method

Participants

To address the above research questions, a diary study with 49 students from a software company in Germany was conducted. Participation was voluntary, and all participants provided written informed consent. As an incentive, the participants were offered a comprehensive workshop on the topics of scientific work and writing theses for the university. Twenty-one participants were female, 28 were male, and their mean age was 22.7 years. Of the 49 participants, 48 were students at various universities, and one was a vocational education and training (VET) student. Of the university students, 31 pursued their bachelor’s degree and 17 pursued their master’s degree. As the study was conducted during the COVID-19 pandemic, participants were also asked to indicate the percentage of their working time they usually spend at home and on site in the office each week. On average, the participants worked 82% of their weekly working time remotely.

Procedure

A semi-standardised diary was used to collect data in situ. The diary period comprised five working days, and the participants were asked to record about ten software-related problems during the diary period. Participants were asked to fill in the diary as soon as possible after the problem occurred or was solved. Depending on their working hours, participants could either keep the diary for five days within one working week or spread the five working days over several weeks (usually three weeks). Before the diary period, the participants completed an additional self-report questionnaire that included demographics, personality traits, workplace characteristics and team characteristics.

Measures

Diary. Most diary items were standardised to reduce participant burden. The diary was provided digitally via the survey web app LimeSurvey and contained five content areas. First, the participants were asked to indicate if they worked remotely or on site in the office at the time the software-related problem occurred. Second, the participants rated three problem characteristics:

  1. perceived guilt for the problem (“To what extent was I to blame for the problem?”, from 1 = not my fault at all to 5 = completely my fault);

  2. potential negative consequences resulting from the problem (“How negative could the potential consequences of the problem be?”, from 1 = no negative consequences to 5 = extremely negative consequences); and

  3. the problem’s urgency (“Please assess the urgency of the problem”, from 1 = no time pressure at all to 5 = very high time pressure).

For all items, the answer not applicable was also available. Third, participants selected the problem-solving activities that were used to deal with the software-related problem at hand. These were derived from the Model of Informal Workplace Learning Through Problem-Solving by Leiß et al. (2022). Available problem-solving activities were asking another person, using information from the internet, using internal information, using software-integrated information, using university course material, using one’s own previous notes, experimenting (trying out) until the problem is solved and observing other people while they dealt with similar problems. Fourth, the participants indicated their emotional experience when dealing with the problem using the circumplex item of emotional experience (Rausch, 2014). Based on Russell (1980) and similar frameworks, eight emotional states were arranged with valence on the x-axis and arousal on the y-axis. The participants were asked to choose a maximum of three out of these eight emotional states they experienced when dealing with the software-related problem (1 = a little to 3 = very). Each emotional state was described using three adjectives. These were motivated/delighted/curious; confident/happy/glad; contented/accepted/proud; calm/even-tempered/daydreaming; bored/dull/uninterested; unhappy/gloomy/sad; irritated/annoyed/angry; and nervous/worried/afraid. Emotional states that were not chosen were coded as zero. Fifth, participants were asked to indicate how much they had learned from dealing with the software-related problem (“In what way did you learn something from working on the problem?” rated on a five-point Likert scale from 1 = learned nothing at all to 5 = learned a lot).

In sum, the participants recorded 242 software-related problems, two of which were excluded from further analysis because of missing data. Descriptives for the activities used for solving software-related problems are displayed in Table 1.

Self-report questionnaire

Team psychological safety. Team psychological safety was measured using the four-item scale by Harvey et al. (2019) (e.g. “In this team, it is easy to speak up about what is on your mind”). A five-point Likert scale from 1 (not agree at all) to 5 (strongly agree) was used. The scale’s consistency was good (Cronbach’s alpha = 0.83).

Occupational self-efficacy. Occupational self-efficacy was measured using the scale by Abele et al. (2000) (e.g. “I know exactly that I can fulfil the requirements of my profession if I only want to”). The scale comprised six items that were rated on a five-point Likert scale from 1 (not agree at all) to 5 (strongly agree). The scale’s consistency was rather low (Cronbach’s alpha = 0.67).

Technology self-efficacy. Technology self-efficacy was measured using the scale by Laver et al. (2012), which comprises ten items and is based on the computer self-efficacy measure by Compeau and Higgins (1995). The items were rated on a five-point Likert scale from 1 (not at all confident) to 5 (completely confident). The scale’s consistency was good (Cronbach’s alpha = 0.80).

Big Five personality traits. The Big Five personality traits were measured by Saucier’s (1994) Big Five Mini Markers and their German version by Weller and Matiaske (2009), which included four adjectives for each trait. The twenty adjectives were rated on a five-point Likert scale from 1 (not agree at all) to 5 (strongly agree). Internal consistency was good for extraversion (Cronbach’s alpha = 0.89), satisfactory for agreeableness (Cronbach’s alpha = 0.71) and conscientiousness (Cronbach’s alpha = 0.69), but unsatisfactory for openness to experience (Cronbach’s alpha = 0.52) and neuroticism (Cronbach’s alpha = 0.45).
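For illustration, a minimal R sketch of how the reliabilities reported above can be estimated, assuming hypothetical item names (psych_safety_1 to psych_safety_4) rather than the authors’ actual ones:

```r
# Minimal sketch (hypothetical item names): Cronbach's alpha for a
# questionnaire scale using the psych package.
library(psych)

# Assumed item columns for the four-item team psychological safety scale
safety_items <- questionnaire[, paste0("psych_safety_", 1:4)]
psych::alpha(safety_items)$total$raw_alpha  # 0.83 reported for this scale
```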

Multilevel analysis

As the diary entries were nested in persons, we conducted a multilevel analysis (Hox et al., 2018; Snijders and Bosker, 2012). Two diary entries were excluded from the multilevel analysis as they contained too many missing values. This resulted in 240 diary entries from 48 participants being included in the multilevel analysis. Following Enders and Tofighi (2007) and Nezlek (2001), we centred predictors on Level 2 at the grand mean and predictors on Level 1 at the group mean. Furthermore, we calculated each person’s baseline level of the problem characteristics and emotional experiences and included these variables in the analysis to investigate the influence of Level 2 differences (Enders and Tofighi, 2007; Pond et al., 2021).
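To illustrate this centering strategy, a minimal R sketch using dplyr with hypothetical variable names (person_id, own_guilt), not the authors’ actual ones:

```r
# Minimal sketch (hypothetical variable names): Level 1 predictors are centred
# at the person (group) mean, and the person means ("baseline levels",
# e.g. Ø own guilt) enter the model as grand-mean-centred Level 2 predictors.
library(dplyr)

diary <- diary |>
  group_by(person_id) |>
  mutate(guilt_mean = mean(own_guilt, na.rm = TRUE),  # baseline level (Level 2)
         guilt_cwc  = own_guilt - guilt_mean) |>      # centred within person (Level 1)
  ungroup() |>
  mutate(guilt_mean_gmc = guilt_mean - mean(guilt_mean))  # grand-mean centred
```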

The research questions were tested by a series of multilevel models using the software R (Bates et al., 2015; Kuznetsova et al., 2017; R Core Team, 2022; Wickham et al., 2022). In Model 1, we added the problem-solving activities. We excluded activities that were used fewer than 15 times for solving software-related problems in the whole data set (observing colleagues, using one’s own notes and using university course material). In Model 2, we included the emotional experiences; in Model 3, the contextual factors; and in Model 4, the personal factors. Because the Big Five personality traits of openness to experience and neuroticism did not show satisfactory scale reliability, they were excluded from the analysis. The models were calculated as random intercept models. The pseudo-R2 value was calculated based on Snijders and Bosker (2012). A table showing means, standard deviations and correlations between all study variables for n = 48 participants and n = 240 diary entries included in the multilevel analysis is included in the Appendix.
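As an illustration of this modelling sequence, the following is a minimal R sketch under assumed variable names (learning, asking_others, internet, internal_info, software_info, experimenting, person_id), fitting the null model and Model 1 as random intercept models, comparing their deviance and computing a pseudo-R² in the spirit of Snijders and Bosker (2012):

```r
# Minimal sketch (hypothetical variable names) of the random-intercept model
# series; lmerTest wraps lme4 and adds p-values for the fixed effects.
library(lmerTest)

# Null model: diary entries (Level 1) nested in persons (Level 2)
m0 <- lmer(learning ~ 1 + (1 | person_id), data = diary, REML = FALSE)

# Model 1: adding the dummy-coded problem-solving activities
m1 <- update(m0, . ~ . + asking_others + internet + internal_info +
                  software_info + experimenting)

anova(m0, m1)  # deviance difference (-2 * log-likelihood) between nested models

# Pseudo-R²: proportional reduction of total (Level 1 + Level 2) variance
v0 <- as.data.frame(VarCorr(m0))
v1 <- as.data.frame(VarCorr(m1))
1 - sum(v1$vcov) / sum(v0$vcov)
```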

Results

First, the null model was calculated to get the intraclass correlation coefficient (ICC). The ICC for learning from dealing with software-related problems is 0.25, indicating that 25% of the variance can be explained by differences on Level 2, that is, between the participants. Then Model 1 was calculated by adding the activities for problem-solving (dummy coded with “0”, indicating that an activity was not used). Model 1 fits the data better than the null model. Table 2 shows the results of all the calculated models. Using information from the internet was identified as a significant positive predictor of self-perceived learning, while experimenting was a significant negative predictor.
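For illustration, a minimal R sketch (again with hypothetical variable names) of how the ICC is obtained from the null model’s variance components:

```r
# Minimal sketch (hypothetical variable names): intraclass correlation from the
# null model, i.e. between-person variance divided by total variance.
library(lme4)

m0 <- lmer(learning ~ 1 + (1 | person_id), data = diary)
vc <- as.data.frame(VarCorr(m0))
tau00  <- vc$vcov[vc$grp == "person_id"]  # Level 2 (between-person) variance
sigma2 <- vc$vcov[vc$grp == "Residual"]   # Level 1 (within-person) variance
tau00 / (tau00 + sigma2)                  # 0.25 reported in the paper
```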

In Model 2, we added the emotional experiences to the analysis. Again, this model fits the data better than the previous one. Results show that a person’s deviation from his or her baseline level of feeling irritated/annoyed/angry was a significant negative predictor of self-perceived learning. This means that higher feelings of being irritated/annoyed/angry are associated with less learning. Moreover, when controlling for the emotional experiences, asking others became a further significant positive predictor of self-perceived learning from dealing with software-related problems, while experimenting was significant only at the ten percent level.

In Model 3, the contextual factors were included, which again resulted in a better model fit. The location of work (dummy coded with “0” indicating remote work and “1” indicating work in the office) was a significant negative predictor, indicating that participants learned less when they dealt with a software-related problem that occurred while they worked in the office. Moreover, the baseline level of the extent to which a person believes that he or she is to blame for the occurrence of software-related problems (Ø own guilt) as well as a person’s deviation from this general tendency (own guilt) were significant positive predictors of self-perceived learning from software-related problems.

In Model 4, we added the personal factors. This did not lead to a significant improvement in model fit compared to Model 3. However, the results identified occupational self-efficacy as a significant positive predictor of self-perceived learning. When controlling for the personal factors, team psychological safety and, once more, experimenting also became significant negative predictors.

Discussion

In the present study, we investigated the influence of problem-solving activities, emotional experiences, contextual antecedents and personal antecedents on learning from dealing with software-related problems in the workplace. In a diary study, 48 students from a German software company recorded 240 diary entries that were analysed by multilevel modelling.

Regarding the effects of problem-solving activities on learning from solving software-related problems (RQ1), the results show that although experimenting on one’s own is the most common problem-solving activity, it has a negative effect on self-perceived learning. There is conflicting evidence with respect to the frequency of using experimentation for problem-solving. While the high frequency we found is consistent with the findings of Andrade et al. (2009) and Novick et al. (2009), who found a high use of trial-and-error approaches in software-related problem-solving, Rausch et al. (2015) found little use of this strategy for general work-related problems in office work. In one of our previous studies, we also found rather low usage of experimenting compared with other activities for solving ERP-related problems (Leiß et al., 2022). Thus, our results partly confirm and partly contradict previous research on the usage frequency of experimenting for solving (software-related) problems. What is surprising at first is the direction of the effect. The little evidence available so far tends to support the opposite effect. Novick et al. (2009), Haemer et al. (2017), Cuyvers et al. (2016) and Andrade et al. (2009), in their cases in combination with other problem-solving activities like recalling and helping, reported positive effects of experimenting on problem-solving success. However, only Haemer et al. (2017) and Cuyvers et al. (2016) referred to learning, but not in a software context. We assume that the effect of experimenting on learning may strongly depend on the complexity of the software and the problems encountered. Presumably, only simple problems can be efficiently solved with trial and error, but they offer little learning opportunity. As already stated in the introduction, knowledge workers are facing increasingly complex tasks and problems. Thus, experimenting may not be well suited for them and their problems and consequently has no positive impact on problem-solving performance or potential learning. An additional explanation would be that users need a certain amount of prior knowledge to learn effectively from experimenting (Haemer et al., 2017). Since our participants had rather little work experience, this could also explain the negative effect of experimenting on learning.

Moreover, Novick et al. (2009) identified three factors affecting experiment outcomes. The first factor is evident, hidden or false affordances, that is, cues and signposts in a software’s user interface that appear to lead to a certain function but in fact do not. A second factor affecting experiment outcomes was the match or mismatch of vocabulary: a user often has a term in mind for a certain function and searches for it in the menu or on the user interface; if a different term is used there or a different path leads to the function, the problem solution may fail. The third factor is users’ incomplete or wrong mental models. In this case, users have not fully understood how the software works or how individual elements are interrelated, which also negatively affects problem-solving and learning (Novick et al., 2009). We believe it is very likely that these factors also played a role among our participants and may explain the negative influence of experimenting on learning that we found.

Asking others and using information from the internet were the second and third most common activities for problem-solving, and both were significant positive predictors of learning. This is in line with the results of other studies that found that asking colleagues (Rausch et al., 2015; Leiß et al., 2022; Novick and Ward, 2006; Cuyvers et al., 2016; Kiani et al., 2020; Kooken et al., 2007; Haemer et al., 2017) and internet resources (Leiß et al., 2022; Cuyvers et al., 2016; Kiani et al., 2020; Kiani et al., 2019; Kooken et al., 2007) were rather frequently used and effective. As reasons for asking other people instead of using some sort of codified information, Kiani et al. (2020) identified task-specific help needs, the availability of company best practices and difficulties in finding codified information, as well as vocabulary problems when using other problem-solving activities (e.g. online research, manuals). Other authors reported similar problems when using problem-solving activities other than asking other people: problems related to finding and identifying relevant information, unsuitable levels of explanation, difficulty of navigation and vocabulary problems, especially for newcomers during help-seeking and learning (Kooken et al., 2007; Novick et al., 2009; Kiani et al., 2019; Novick and Ward, 2006; Andrade et al., 2009). We assume that these reasons explain why our participants, who had rather limited work experience, often relied on asking others, and that not having to face these difficulties when solving problems supports learning.

Regarding the high usage of information from the internet, other studies found that online help is used because participants are familiar with it from other professional or personal contexts (Kiani et al., 2019) and that users often avoid printed manuals because of navigation problems, outdated information, bulkiness and an insufficient level of detail, turning instead to online help or online documentation (Novick and Ward, 2006). These could be reasons why the participants in our study frequently used information from the internet; again, problem-solving activities and sources that do not present the aforementioned difficulties are likely to promote learning better.

Addressing the role of emotional experiences in learning from solving software-related problems (RQ2), an above-average experience of being irritated/annoyed/angry was negatively related to self-perceived learning. Our results support Savolainen’s (2014) findings, which showed that negative emotions like anxiety, aversion, fear and irritation can limit and terminate people’s information-seeking, and we expect that this can also impact learning. Moreover, in control-value theory, anger is categorised as a negative activating emotion that can lead to either positive or negative learning outcomes, depending on the task and contextual conditions (Pekrun et al., 2011; Pekrun and Stephens, 2010). These ambivalent outcomes are also reflected in the empirical results (Rausch et al., 2017; Callister et al., 2017; Loderer et al., 2020; Reio and Callahan, 2004). The negative relationship we identified is in line with the results of Pekrun et al. (2011). However, the causality is unclear because irritation and anger towards the respective software or problem at hand could also be the result of a lack of problem-solving and learning success.

Including contextual factors in the analysis (RQ3), working on site in the office (as opposed to working remotely) was a significant negative predictor of learning from software-related problems, indicating that participants learned less when they dealt with a software-related problem that occurred while they worked in the office. Furthermore, the baseline level of guilt (Ø own guilt) as well as the situational above-average experience of guilt (own guilt) were significant positive predictors of learning from software-related problems. Feng and MacGeorge (2006) assume that perceived responsibility for a problem may influence either receptivity to advice and the feeling that the problem is solvable, or fear of losing face and fear of negative evaluation by others, leading to resistance to advice. Feng and MacGeorge (2006) found no significant effect of responsibility on receptiveness to advice. Perhaps, however, the direction of the effect we found reflects the former mechanism, namely greater receptivity to advice and the feeling that the problem is solvable, which would then have a positive effect on learning. Although there are studies that did not find a direct positive effect of guilt on learning (Rausch et al., 2017; Zhao, 2011), the direct positive effect we found is in line with the results of Liu and Xiang (2018).

The negative effect of working in the office is surprising, as we had expected that the opportunities to learn from the help of colleagues would be greater in the office (Škerlavaj and Dimovski, 2006). Besides, face-to-face interactions are often preferred and useful for problem-solving and learning (Kooken et al., 2007; Kiani et al., 2020) and, as already discussed, the results of the present study identified asking others to be a significant positive predictor of learning from solving software-related problems. It is possible that the positive impact of remote work on learning can be explained by the fact that remote workers rely more on using information from the internet, which also proved to be a significant positive predictor of self-perceived learning. However, there is no significant positive correlation between working remotely and using information from the internet. A further explanation for the unexpected results may be that working remotely allows employees to take more time to reflect and elaborate on a problem, which, in turn, fosters learning (Haemer et al., 2017).

Finally, including personal factors related to learning from solving software-related problems (RQ4) did not improve the model fit. Still, the significant positive influence of occupational self-efficacy is well in line with the existing research (Cerasoli et al., 2018; Jeong et al., 2018; Rintala et al., 2019). Most surprisingly, when controlling for personal factors, team psychological safety turned out to be a significant negative predictor of learning, which contradicts the findings of previous studies (Edmondson and Lei, 2014; Frazier et al., 2017; Newman et al., 2017). One plausible explanation might be that high psychological safety may lead to turning to others too quickly without even trying to solve the problem by oneself, to delegating problems completely or to wasting time with unimportant things (Edmondson and Lei, 2014). This would mean losing learning opportunities, while lower psychological safety forces one to solve the problem on one’s own, thus taking advantage of learning opportunities. However, this would question the positive effects of asking others as described above, and such a relationship cannot be found in the correlations. Team psychological safety is significantly positively correlated with working remotely and experimenting on Level 1. Because of the physical distance between colleagues, it may feel safer to just experiment when a person encounters a problem while working remotely. The obvious lack of knowledge cannot be observed directly by colleagues. Experimenting, however, is associated with less learning.

Limitations and further research

Our research is subject to several limitations. Some of the problem-solving activities were excluded from the analysis because they occurred too rarely, and for the activity of asking other people, we did not differentiate whether the participants asked face-to-face or used communication tools. Furthermore, we did not consider the order in which the activities were performed, which could have revealed more complex strategies. Moreover, we had a rather small convenience sample of rather young employees from only one company. Thus, the generalisability of our results is limited. Finally, learning was measured by only one diary item.

Altogether, using the diary method revealed deep insights into the complex processes of learning from software-related problem-solving in the workplace. The two strands of research, namely research on solving software-related problems and research on learning from problem-solving in the workplace, should be further integrated. Our study revealed several surprising results that should be investigated in replication studies and could be enhanced by qualitative data such as interviews or observational data. In terms of content, further research on possible mediators of the relationship between psychological safety and learning from problem-solving in the workplace would be interesting. Furthermore, the availability and use of software and tools for solving software-related problems (e.g. asking colleagues via communication platforms vs face-to-face) would be an exciting dimension. Finally, the content and extent of perceived learning could be captured in a more differentiated way.

Table 1. Descriptive statistics for problem-solving activities

Problem-solving activity Usage absolute Usage in % of all problems
Experimenting (trying out) 123 51.3
Using information from the internet 88 36.7
Asking another person 81 33.8
Using internal information 19 7.9
Using software-integrated information 16 6.7
Using one’s own previous notes 11 4.6
Observing another person 9 3.8
Using university course material 4 1.7
Notes:

Multiple responses were allowed; usage in % refers to all n = 240 problems

Source: Authors’ own work

Table 2. Multilevel estimates predicting self-perceived learning from dealing with software-related problems

Null model Model 1 Model 2 Model 3 Model 4
Predictors Estimate SE t Estimate SE t Estimate SE t Estimate SE t Estimate SE t
Intercept 2.99*** 0.17 17.72 2.81*** 0.26 10.654 2.30*** 0.62 3.691 1.73* 0.71 2.423 1.94* 0.74 2.627
Problem-solving activities
Asking others 0.40 0.25 1.608 0.48* 0.23 2.065 0.50* 0.22 2.236 0.50* 0.22 2.217
Using information from the internet 0.81*** 0.23 3.457 0.75** 0.23 3.316 0.62** 0.22 2.810 0.54* 0.22 2.406
Using internal information 0.35 0.37 0.941 0.49 0.36 1.367 0.47 0.35 1.347 0.48 0.34 1.403
Using software-integrated information 0.24 0.40 0.600 0.44 0.38 1.171 0.40 0.37 1.094 0.52 0.37 1.422
Experimenting −0.59** 0.22 −2.607 −0.43 0.22 −1.965 −0.38 0.21 −1.783 −0.43* 0.21 −2.035
Emotional experience
Ø motivated/delighted/curious 0.41 0.30 1.387 0.21 0.27 0.803 −0.14 0.29 −0.492
Ø confident/happy/glad 0.06 0.40 0.141 −0.11 0.37 −0.305 −0.06 0.36 −0.152
Ø contented/accepted/proud −0.32 0.34 −0.919 −0.29 0.31 −0.946 −0.36 0.30 −1.202
Ø calm/even-tempered/daydreaming 0.19 0.33 0.590 0.10 0.29 0.348 0.28 0.28 1.011
Ø nervous/worried/afraid 0.01 0.43 0.014 −0.47 0.44 −1.071 −0.51 0.43 −1.187
Ø bored/dull/uninterested −0.03 0.43 −0.062 0.35 0.42 0.836 0.45 0.41 1.095
Ø unhappy/gloomy/sad −0.04 0.58 −0.068 −0.07 0.49 −0.142 0.13 0.46 0.287
Ø irritated/annoyed/angry 0.05 0.33 0.146 −0.06 0.29 −0.210 −0.10 0.30 −0.350
Motivated/delighted/curious 0.10 0.11 0.855 0.14 0.11 1.279 0.15 0.11 1.357
Confident/happy/glad 0.21 0.13 1.659 0.18 0.13 1.418 0.18 0.13 1.421
Contented/accepted/proud 0.11 0.15 0.713 0.11 0.14 0.766 0.12 0.14 0.820
Calm/even-tempered/daydreaming −0.22 0.15 −1.490 −0.18 0.15 −1.226 −0.18 0.15 −1.223
Nervous/worried/afraid −0.09 0.14 −0.645 −0.11 0.14 −0.794 −0.10 0.14 −0.727
Bored/dull/uninterested 0.00 0.17 −0.020 0.02 0.17 0.111 0.03 0.17 0.179
Unhappy/gloomy/sad −0.14 0.19 −0.726 −0.14 0.18 −0.770 −0.15 0.18 −0.825
Irritated/annoyed/angry −0.36** 0.13 −2.871 −0.35** 0.12 −2.774 −0.34** 0.12 −2.750
Contextual factors
Location of work (0 = remote work, 1 = work in the office) −0.74* 0.30 −2.480 −0.68* 0.29 −2.291
Ø own guilt 0.33** 0.12 2.684 0.42** 0.13 3.113
Ø negative consequences 0.02 0.13 0.165 0.11 0.13 0.866
Ø urgency 0.08 0.14 0.588 −0.08 0.17 −0.477
Own guilt 0.13* 0.06 2.325 0.14* 0.06 2.413
Negative consequences −0.06 0.06 −0.881 −0.06 0.06 −0.911
Urgency 0.10 0.08 1.261 0.11 0.08 1.346
Team psychological safety −0.45 0.24 −1.838 −0.67** 0.25 −2.726
Personal factors
Occupational self-efficacy 0.81* 0.35 2.315
Technology self-efficacy 0.16 0.30 0.523
Extraversion −0.13 0.16 −0.839
Conscientiousness 0.63 0.36 1.726
Agreeableness −0.21 0.37 −0.551
−2 * log 917.63 887.75 850.84 823.11 814.55
Diff −2*log 29.88*** 36.903** 27.729*** 8.563
Δdf 5 16 8 5
R² 0.14 0.24 0.36 0.40
Notes:

*p < 0.05; **p < 0.01; ***p < 0.001

Source: Authors’ own work

Table A1 (Appendix). Means, standard deviations and correlations between study variables

Variable Person level (Level 2) Problem level (Level 1) 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24
M SD M SD
Personal factors and team psychological safety
1. Team psychological safety 4.36 0.66 0.33* 0.20 −0.04 0.06 0.01 0.06 0.01 0.12 0.24 0.22 0.02 0.07 0.10 0.08 −0.25 0.26 0.04 0.05 −0.19 −0.17 −0.10 0.00 −0.21
2. Occupational self-efficacy 4.05 0.50 0.36* 0.21 0.26 −0.04 −0.22 0.26 0.05 −0.04 0.26 0.06 0.07 0.20 −0.31* −0.17 −0.18 −0.07 0.07 −0.07 −0.21 −0.24 0.08 −0.03
3. Technology self-efficacy 4.27 0.47 0.11 −0.12 0.02 −0.17 0.22 0.14 0.04 0.07 0.22 0.23 0.25 0.03 −0.07 −0.05 0.06 −0.04 −0.08 0.00 0.12 0.29* 0.03
4. Extraversion 3.90 0.95 0.09 0.01 −0.12 −0.04 −0.06 0.06 0.11 −0.08 −0.01 0.07 0.06 −0.31* −0.24 0.02 0.20 0.10 −0.12 0.03 0.31* −0.18
5. Conscientiousness 4.34 0.45 0.36* −0.36* 0.30* −0.20 −0.25 0.16 −0.30* −0.17 −0.22 0.01 −0.08 0.05 −0.09 0.16 0.15 −0.08 −0.07 0.07 0.07
6. Agreeableness 4.45 0.49 0.09 −0.06 0.00 −0.22 0.12 0.09 −0.12 −0.11 0.03 0.17 0.25 −0.01 0.20 −0.17 −0.23 −0.05 0.01 0.16
Problem-solving activities
7. Asking others 0.39 0.35 0.34 0.47 −0.05 −0.12 −0.11 0.00 −0.21** 0.05 −0.46** 0.33* 0.10 −0.31* 0.09 −0.05 −0.01 0.06 0.08 0.33* −0.13 0.01 −0.23 −0.10 0.01 −0.05 0.12
8. Using information from the internet 0.37 0.35 0.37 0.48 −0.04 0.13* 0.22** −0.14* 0.11 0.00 −0.25** 0.03 0.01 0.12 0.03 0.24 0.39** −0.09 −0.10 −0.17 0.07 −0.18 −0.19 0.16 −0.11 −0.06 0.22
9. Using internal information 0.09 0.19 0.08 0.27 0.04 0.03 0.09 −0.07 0.00 0.10 0.15* 0.13* 0.55** 0.12 −0.31* −0.06 0.28 −0.15 0.03 0.03 0.08 0.09 −0.19 0.10 0.06 0.08 0.14
10. Using software-integrated information 0.07 0.17 0.07 0.25 0.10 −0.01 −0.01 −0.02 −0.02 −0.08 −0.08 0.11 −0.02 0.19 0.12 −0.19 0.21 −0.12 −0.12 −0.08 0.01 0.13 −0.18 0.16 0.15 −0.03 −0.02
11. Experimenting 0.52 0.35 0.51 0.50 0.16* 0.08 −0.05 −0.05 0.21** 0.11 −0.33** −0.16* 0.07 −0.04 −0.26 −0.45** 0.08 −0.35* 0.11 0.09 0.18 0.43** 0.02 −0.35* −0.23 0.03 −0.44**
Emotional experiences
12. Motivated/delighted/curious 0.79 0.83 0.85 1.15 −0.01 0.03 0.19** −0.03 −0.18** 0.12 0.02 0.20** 0.11 −0.08 −0.14* 0.68** 0.35* 0.27 −0.13 −0.13 0.01 −0.43** −0.02 0.47** 0.28 −0.10 0.35*
13. Confident/happy/glad 0.79 0.80 0.88 1.11 0.04 0.06 0.15* 0.02 −0.22** −0.01 −0.02 0.17** −0.05 −0.06 −0.22** 0.51** 0.37** 0.35* −0.19 −0.23 0.11 −0.71** 0.00 0.47** 0.19 0.03 0.24
14. Contented/accepted/proud 0.41 0.61 0.38 0.84 0.01 0.06 0.19** −0.05 −0.13* 0.09 −0.01 0.20** 0.02 −0.08 −0.02 0.46** 0.40** −0.07 −0.15 −0.20 0.02 −0.24 −0.23 0.16 −0.10 −0.10 0.04
15. Calm/even-tempered/daydreaming 0.44 0.64 0.41 0.81 0.03 −0.05 0.08 −0.02 −0.08 0.01 −0.04 0.05 0.04 −0.05 −0.15* 0.19** 0.26** −0.05 −0.31* 0.10 −0.04 −0.24 0.18 0.32* 0.21 0.17 0.20
16. Nervous/worried/afraid 0.43 0.49 0.44 0.81 −0.14* −0.07 −0.06 −0.05 0.09 0.07 0.05 −0.02 0.15* −0.06 0.09 −0.20** −0.32** −0.21** −0.23** 0.48** 0.19 0.17 −0.16 −0.17 0.01 0.05 −0.08
17. Bored/dull/uninterested 0.31 0.50 0.28 0.68 0.12 −0.08 0.02 −0.06 0.13* 0.19** −0.04 0.00 0.04 0.01 0.19** −0.20** −0.28** −0.14* −0.12 0.11 0.12 0.32* −0.10 −0.08 0.02 0.20 0.00
18. Unhappy/gloomy/sad 0.14 0.27 0.22 0.60 0.08 0.01 0.00 0.07 −0.03 −0.01 0.07 −0.09 0.10 0.01 0.05 −0.18** −0.08 −0.15* −0.14* 0.23** 0.10 0.09 −0.12 0.15 −0.03 0.07 −0.05
19. Irritated/annoyed/angry 1.02 0.76 0.96 1.03 0.07 0.01 −0.11 0.12 0.12 −0.06 −0.02 −0.16* −0.02 0.06 0.22** −0.42** −0.51** −0.34** −0.32** 0.11 0.26** 0.17** 0.01 −0.23 −0.14 0.00 −0.11
Contextual factors
20. Location of work (0 = remote work, 1 = work in the office) 0.14 0.28 0.14 0.35 −0.19** −0.07 −0.06 0.01 0.04 −0.13* 0.00 −0.05 −0.07 −0.11 0.03 −0.02 −0.05 −0.08 0.10 −0.10 −0.02 −0.06 0.04 0.06 0.14 0.09 −0.10
21. Own guilt 2.53 1.46 2.88 2.09 −0.09 −0.08 0.02 −0.06 −0.07 −0.18** −0.11 0.25** −0.02 0.10 −0.13* 0.20** 0.28** 0.07 0.13* −0.09 −0.02 0.05 −0.17** −0.06 0.46** 0.08 0.56**
22. Negative consequences 3.24 1.26 3.36 1.78 −0.07 −0.14* 0.12 0.00 −0.04 0.00 −0.15* 0.08 0.02 0.11 −0.04 0.08 0.03 0.00 0.07 0.02 0.08 −0.01 −0.03 0.02 0.30** 0.30* 0.22
23. Urgency 3.41 1.09 3.40 1.45 −0.01 0.12 0.22** 0.20** 0.09 0.14* −0.15* 0.09 0.12 −0.02 0.08 −0.07 0.00 −0.04 0.01 0.21** 0.08 0.07 0.07 0.00 0.08 0.31** 0.00
Learning
24. Self-perceived learning 2.92 1.23 3.09 1.75 −0.15* 0.04 0.17** −0.06 −0.06 0.03 0.06 0.28** 0.07 0.06 −0.28** 0.29** 0.30** 0.20** 0.11 −0.10 −0.09 −0.07 −0.31** −0.15* 0.36** 0.14* 0.11
Notes:

*p < 0.05, **p < 0.01, ***p < 0.001; Means and standard deviations at the person level are displayed in Columns 1 and 2; means and standard deviations at the problem level are displayed in Columns 3 and 4; correlations above the diagonal refer to person-level data (Level 2) (n = 48), with problem-level variables aggregated at the person level; correlations below the diagonal refer to problem-level diary data (Level 1) (n = 240)

Source: Authors’ own work


References

Abele, A.E., Stief, M. and Andrä, M.S. (2000), “Zur ökonomischen Erfassung beruflicher Selbstwirksamkeitserwartungen – Neukonstruktion einer BSW-Skala [On the economic assessment of occupational self-efficacy expectations - Reconstruction of a BSW scale]”, Zeitschrift für Arbeits- und Organisationspsychologie, Vol. 44 No. 3, pp. 145-151.

Al-Samarraie, H., Eldenfria, A. and Dawoud, H. (2017), “The impact of personality traits on users’ information-seeking behavior”, Information Processing and Management, Vol. 53 No. 1, pp. 237-247.

Andrade, O.D., Bean, N. and Novick, D.G. (2009), “The macro-structure of use of help”, Proceedings of the 27th ACM International Conference on Design of Communication (SIGDOC ’09), ACM, pp. 143-150.

Bandura, A. (1986), Social Foundations of Thought and Action: A Social Cognitive Theory, Prentice-Hall, Englewood Cliffs, NJ.

Barnett, T., Pearson, A.W., Pearson, R. and Kellermanns, F.W. (2015), “Five-factor model personality traits as predictors of perceived and actual usage of technology”, European Journal of Information Systems, Vol. 24 No. 4, pp. 374-390.

Bates, D., Mächler, M., Bolker, B. and Walker, S. (2015), “Fitting linear mixed-effects models using lme4”, Journal of Statistical Software, Vol. 67 No. 1.

Benbasat, I. and Barki, H. (2007), “Quo vadis, TAM?”, Journal of the Association for Information Systems, Vol. 8 No. 4, pp. 211-218.

Benozzo, A. and Colley, H. (2012), “Emotion and learning in the workplace: critical perspectives”, Journal of Workplace Learning, Vol. 24 No. 5, pp. 304-316.

Bolger, N., Davis, A. and Rafaeli, E. (2003), “Diary methods: capturing life as it is lived”, Annual Review of Psychology, Vol. 54 No. 1, pp. 579-616.

Callister, R.R., Geddes, D. and Gibson, D.F. (2017), “When is anger helpful or hurtful? Status and role impact on anger expression and outcomes”, Negotiation and Conflict Management Research, Vol. 10 No. 2, pp. 69-87.

Carmeli, A. and Gittell, J.H. (2009), “High-quality relationships, psychological safety, and learning from failures in work organizations”, Journal of Organizational Behavior, Vol. 30 No. 6, pp. 709-729.

Cerasoli, C.P., Alliger, G.M., Donsbach, J.S., Mathieu, J.E., Tannenbaum, S.I. and Orvis, K.A. (2018), “Antecedents and outcomes of informal learning behaviors: a meta-analysis”, Journal of Business and Psychology, Vol. 33 No. 2, pp. 203-230.

Choi, W. and Jacobs, R.L. (2011), “Influences of formal learning, personal learning orientation, and supportive learning environment on informal learning”, Human Resource Development Quarterly, Vol. 22 No. 3, pp. 239-257.

Cleavenger, D., Gardner, W.L. and Mhatre, K. (2007), “Help-seeking: testing the effects of task interdependence and normativeness on employees’ propensity to seek help”, Journal of Business and Psychology, Vol. 21 No. 3, pp. 331-359.

Compeau, D.R. and Higgins, C.A. (1995), “Computer self-efficacy: development of a measure and initial test”, MIS Quarterly, Vol. 19 No. 2, pp. 189-211.

Cuyvers, K., Donche, V. and van den Bossche, P. (2016), “Learning beyond graduation: exploring newly qualified specialists' entrance into daily practice from a learning perspective”, Advances in Health Sciences Education, Vol. 21 No. 2, pp. 439-453.

Devaraj, S., Easley, R.F. and Crant, J.M. (2008), “How does personality matter? Relating the five-factor model to technology acceptance and use”, Information Systems Research, Vol. 19 No. 1, pp. 93-105.

Dörner, D. and Wearing, A.J. (1995), “Complex problem solving: toward a (computer-simulated) theory”, in Frensch, P.A. and Funke, J. (Eds), Problem Solving: The European Perspective, Lawrence Erlbaum, Hillsdale, NJ, pp. 65-99.

Edmondson, A. (1999), “Psychological safety and learning behavior in work teams”, Administrative Science Quarterly, Vol. 44 No. 2, pp. 350-383.

Edmondson, A.C. and Lei, Z. (2014), “Psychological safety: the history, renaissance, and future of an interpersonal construct”, Annual Review of Organizational Psychology and Organizational Behavior, Vol. 1 No. 1, pp. 23-43.

Enders, C.K. and Tofighi, D. (2007), “Centering predictor variables in cross-sectional multilevel models: a new look at an old issue”, Psychological Methods, Vol. 12 No. 2, pp. 121-138.

Eraut, M. (2000), “Non-formal learning and tacit knowledge in professional work”, British Journal of Educational Psychology, Vol. 70 No. 1, pp. 113-136.

Eraut, M. (2004), “Informal learning in the workplace”, Studies in Continuing Education, Vol. 26 No. 2, pp. 247-273.

Eraut, M. (2010), “Knowledge, working practices, and learning”, in Billett, S. (Ed.), Learning through Practice: Models, Traditions, Orientations and Approaches, Springer, Dordrecht, Heidelberg, London, New York, NY, pp. 37-58.

Feng, B. and MacGeorge, E.L. (2006), “Predicting receptiveness to advice: characteristics of the problem, the advice-giver, and the recipient”, Southern Communication Journal, Vol. 71 No. 1, pp. 67-85.

Frazier, M.L., Fainshmidt, S., Klinger, R.L., Pezeshkan, A. and Vracheva, V. (2017), “Psychological safety: a meta-analytic review and extension”, Personnel Psychology, Vol. 70 No. 1, pp. 113-165.

Haemer, H.D., Borges-Andrade, J.E. and Cassiano, S.K. (2017), “Learning strategies at work and professional development”, Journal of Workplace Learning, Vol. 29 No. 6, pp. 490-506.

Harteis, C. (2022), “Research on workplace learning in times of digitalisation”, in Harteis, C., Gijbels, D. and Kyndt, E. (Eds), Research Approaches on Workplace Learning: Insights from a Growing Field, Springer Nature, Cham, pp. 415-428.

Harvey, J.-F., Johnson, K.J., Roloff, K.S. and Edmondson, A.C. (2019), “From orientation to behavior: the interplay between learning orientation, open-mindedness, and psychological safety in team learning”, Human Relations, Vol. 72 No. 11, pp. 1726-1751.

Heinström, J. (2005), “Fast surfing, broad scanning and deep diving”, Journal of Documentation, Vol. 61 No. 2, pp. 228-247.

Hökkä, P., Vähäsantanen, K. and Paloniemi, S. (2020), “Emotions in learning at work: a literature review”, Vocations and Learning, Vol. 13 No. 1, pp. 1-25.

Hox, J.J., Moerbeek, M. and van de Schoot, R. (2018), Multilevel Analysis: Techniques and Applications, 3rd ed., Routledge, New York, NY.

Jeong, S., Han, S.J., Lee, J., Sunalai, S. and Yoon, S.W. (2018), “Integrative literature review on informal learning: antecedents, conceptualizations, and future directions”, Human Resource Development Review, Vol. 17 No. 2, pp. 128-152.

Jonassen, D.H. (1997), “Instructional design models for well-structured and ill-structured problem-solving learning outcomes”, Educational Technology Research and Development, Vol. 45 No. 1, pp. 65-94.

Kiani, K., Chilana, P.K., Bunt, A., Grossman, T. and Fitzmaurice, G. (2020), “‘I would just ask someone’: learning feature-rich design software in the modern workplace”, IEEE Symposium on Visual Languages and Human-Centric Computing (VL/HCC), pp. 1-10.

Kiani, K., Cui, G., Bunt, A., McGrenere, J. and Chilana, P.K. (2019), “Beyond ‘one-size-fits-all’: understanding the diversity in how software newcomers discover and make use of help resources”, Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, pp. 1-14.

Kooken, J., Ley, T. and de Hoog, R. (2007), “How do people learn at the workplace? Investigating four workplace learning assumptions”, in Duval, E., Klamma, R. and Wolpers, M. (Eds), Creating New Learning Experiences on a Global Scale: EC-TEL 2007. Lecture Notes in Computer Science, Springer, Berlin, Heidelberg, Vol. 4753, pp. 158-171.

Kuznetsova, A., Brockhoff, P.B. and Christensen, R.H.B. (2017), “lmerTest package: tests in linear mixed effects models”, Journal of Statistical Software, Vol. 82 No. 13, pp. 1-26.

Laver, K., George, S., Ratcliffe, J. and Crotty, M. (2012), “Measuring technology self efficacy: reliability and construct validity of a modified computer self efficacy scale in a clinical rehabilitation setting”, Disability and Rehabilitation, Vol. 34 No. 3, pp. 220-227.

Lee, F. (1997), “When the going gets tough, do the tough ask for help? Help seeking and power motivation in organizations”, Organizational Behavior and Human Decision Processes, Vol. 72 No. 3, pp. 336-363.

Lee, Y., Kozar, K.A. and Larsen, K.R. (2003), “The technology acceptance model: past, present, and future”, Communications of the Association for Information Systems, Vol. 12, Article 50.

Leiß, T.V., Rausch, A. and Seifried, J. (2022), “Problem-solving and tool use in office work: the potential of electronic performance support systems to promote employee performance and learning”, Frontiers in Psychology, Vol. 13, p. 869428.

Littlejohn, A. and Margaryan, A. (2014a), “Introduction: technology-enhanced professional learning. Mapping out a new domain”, in Littlejohn, A. and Margaryan, A. (Eds), Technology-Enhanced Professional Learning: Processes, Practices and Tools, Routledge, New York, NY, pp. 1-13.

Littlejohn, A. and Margaryan, A. (2014b), “Technology-enhanced professional learning”, in Billett, S., Harteis, C. and Gruber, H. (Eds), International Handbook of Research in Professional and Practice-Based Learning, Springer, Dordrecht, pp. 1187-1212.

Littlejohn, A. and Pammer-Schindler, V. (2022), “Technologies for professional learning”, in Harteis, C., Gijbels, D. and Kyndt, E. (Eds), Research Approaches on Workplace Learning: Insights from a Growing Field, Springer Nature, Cham, pp. 321-346.

Liu, W. and Xiang, S. (2018), “The positive impact of guilt”, Leadership and Organization Development Journal, Vol. 39 No. 7, pp. 883-898.

Loderer, K., Pekrun, R. and Lester, J.C. (2020), “Beyond cold technology: a systematic review and meta-analysis on emotions in technology-based learning environments”, Learning and Instruction, Vol. 70 No. 4, p. 101162.

McCrae, R.R. and Costa, P.T., Jr. (1987), “Validation of the five-factor model of personality across instruments and observers”, Journal of Personality and Social Psychology, Vol. 52 No. 1, pp. 81-90.

McCrae, R.R. and Costa, P.T., Jr. (1999), “A five-factor theory of personality”, in Pervin, L.A. and John, O.P. (Eds), Handbook of Personality: Theory and Research, The Guilford Press, New York, NY, pp. 139-153.

Marsick, V.J. and Watkins, K. (2015), Informal and Incidental Learning in the Workplace, Routledge, Abingdon, New York, NY.

Mornata, C. and Cassar, I. (2018), “The role of insiders and organizational support in the learning process of newcomers during organizational socialization”, Journal of Workplace Learning, Vol. 30 No. 7, pp. 562-575.

Morrison-Smith, S. and Ruiz, J. (2020), “Challenges and barriers in virtual teams: a literature review”, SN Applied Sciences, Vol. 2 No. 6, p. 179.

Newell, A. and Simon, H.A. (1972), Human Problem Solving, Prentice-Hall, Englewood Cliffs, NJ.

Newman, A., Donohue, R. and Eva, N. (2017), “Psychological safety: a systematic review of the literature”, Human Resource Management Review, Vol. 27 No. 3, pp. 521-535.

Nezlek, J.B. (2001), “Multilevel random coefficient analyses of event- and interval-contingent data in social and personality psychology research”, Personality and Social Psychology Bulletin, Vol. 27 No. 7, pp. 771-785.

Noe, R.A., Clarke, A.D. and Klein, H.J. (2014), “Learning in the twenty-first-century workplace”, Annual Review of Organizational Psychology and Organizational Behavior, Vol. 1 No. 1, pp. 245-275.

Noe, R.A., Tews, M.J. and Marand, A.D. (2013), “Individual differences and informal learning in the workplace”, Journal of Vocational Behavior, Vol. 83 No. 3, pp. 327-335.

Novick, D.G. and Ward, K. (2006), “Why don't people read the manual?”, in ACM SIGDOC'06 (Eds), Proceedings of the 24th Annual ACM International Conference on Design of Communication, pp. 11-18.

Novick, D.G., Andrade, O.D. and Bean, N. (2009), “The micro-structure of use of help”, in ACM SIGDOC'09 (Eds), Proceedings of the 27th ACM International Conference on Design of Communication, pp. 97-104.

Ohly, S., Sonnentag, S., Niessen, C. and Zapf, D. (2010), “Diary studies in organizational research. An introduction and some practical recommendations”, Journal of Personnel Psychology, Vol. 9 No. 2, pp. 79-93.

Özbek, V., Alnıaçık, Ü., Koc, F., Akkılıç, M.E. and Kaş, E. (2014), “The impact of personality on technology acceptance: a study on smart phone users”, Procedia - Social and Behavioral Sciences, Vol. 150 No. 2, pp. 541-551.

Pekrun, R. and Stephens, E.J. (2010), “Achievement emotions: a control-value approach”, Social and Personality Psychology Compass, Vol. 4 No. 4, pp. 238-255.

Pekrun, R., Goetz, T., Frenzel, A.C., Barchfeld, P. and Perry, R.P. (2011), “Measuring emotions in students’ learning and performance: the achievement emotions questionnaire (AEQ)”, Contemporary Educational Psychology, Vol. 36 No. 1, pp. 36-48.

Pond, R.S., McCool, M.W. and Bulla, B.A. (2021), “Multilevel modeling of interval-contingent data in neuropsychology research using the lmerTest package in R”, Journal of Pediatric Neuropsychology, Vol. 7 No. 3, pp. 102-112.

Post, C. (2012), “Deep-level team composition and innovation”, Group and Organization Management, Vol. 37 No. 5, pp. 555-588.

Rausch, A. (2013), “Task characteristics and learning potentials – empirical results of three diary studies on workplace learning”, Vocations and Learning, Vol. 6 No. 1, pp. 55-79.

Rausch, A. (2014), “Using diaries in research on work and learning”, in Harteis, C., Rausch, A. and Seifried, J. (Eds), Discourses on Professional Learning: On the Boundary Between Learning and Working, Springer, Dordrecht, Heidelberg, New York, London, pp. 341-366.

Rausch, A., Schley, T. and Warwas, J. (2015), “Problem solving in everyday office work – a diary study on differences between experts and novices”, International Journal of Lifelong Education, Vol. 34 No. 4, pp. 448-467.

Rausch, A., Seifried, J. and Harteis, C. (2017), “Emotions, coping, and learning in error situations in the workplace”, Journal of Workplace Learning, Vol. 29 No. 5, pp. 374-393.

Rausch, A., Goller, M. and Steffen, B. (2022), “Uncovering informal workplace learning by using diaries”, in Kyndt, E., Paloniemi, S. and Damsa, C. (Eds), Methods for Researching Professional Learning and Development: Challenges, Applications, and Empirical Illustrations, Springer, Cham, pp. 43-70.

R Core Team (2022), “R: a language and environment for statistical computing”, R Foundation for Statistical Computing, Vienna.

Reio, T.G., Jr. and Callahan, J.L. (2004), “Affect, curiosity, and socialization-related learning: a path analysis of antecedents to job performance”, Journal of Business and Psychology, Vol. 19 No. 1, pp. 3-22.

Rintala, H., Nokelainen, P. and Pylväs, L. (2019), “Informal workplace learning. Turning the workplace into a learning site”, in McGrath, S., Mulder, M., Papier, J. and Suart, R. (Eds), Handbook of Vocational Education and Training, Springer, Cham, pp. 1-14.

Russell, J.A. (1980), “A circumplex model of affect”, Journal of Personality and Social Psychology, Vol. 39 No. 6, pp. 1161-1178.

Saucier, G. (1994), “Mini-Markers: a brief version of Goldberg's unipolar big-five markers”, Journal of Personality Assessment, Vol. 63 No. 3, pp. 506-516.

Savolainen, R. (2014), “Emotions as motivators for information seeking: a conceptual analysis”, Library and Information Science Research, Vol. 36 No. 1, pp. 59-65.

Škerlavaj, M. and Dimovski, V. (2006), “Social network approach to organizational learning”, Journal of Applied Business Research, Vol. 22 No. 2, pp. 89-98.

Smith, M.U. (1991), “A view from biology”, in Smith, M.U. (Ed.), Toward a Unified Theory of Problem Solving: Views from the Content Domains, Routledge, New York, NY, London, pp. 1-20.

Snijders, T.A.B. and Bosker, R.J. (2012), Multilevel Analysis: An Introduction to Basic and Advanced Multilevel Modeling, 2nd ed., SAGE Publications, London, Thousand Oaks, CA, New Delhi, Singapore.

Spering, M., Wagener, D. and Funke, J. (2005), “The role of emotions in complex problem-solving”, Cognition and Emotion, Vol. 19 No. 8, pp. 1252-1261.

Tynjälä, P. (2008), “Perspectives into learning at the workplace”, Educational Research Review, Vol. 3 No. 2, pp. 130-154.

Tynjälä, P. (2013), “Toward a 3-P model of workplace learning: a literature review”, Vocations and Learning, Vol. 6 No. 1, pp. 11-36.

Tynjälä, P. and Häkkinen, P. (2005), “E-learning at work: theoretical underpinnings and pedagogical challenges”, Journal of Workplace Learning, Vol. 17 Nos 5/6, pp. 318-336.

van der Rijt, J., van den Bossche, P., van de Wiel, M.W.J., de Maeyer, S., Gijselaers, W.H. and Segers, M.S.R. (2013), “Asking for help: a relational perspective on help seeking in the workplace”, Vocations and Learning, Vol. 6 No. 2, pp. 259-279.

Venkatesh, V. (2000), “Determinants of perceived ease of use: integrating control, intrinsic motivation, and emotion into the technology acceptance model”, Information Systems Research, Vol. 11 No. 4, pp. 342-365.

Venkatesh, V. and Bala, H. (2008), “Technology acceptance model 3 and a research agenda of interventions”, Decision Sciences, Vol. 39 No. 2, pp. 273-315.

Vu, T., Bennett, D. and Ananthram, S. (2022), “Learning in the workplace: newcomers’ information seeking behaviour and implications for education”, Studies in Continuing Education, Vol. 45 No. 2, pp. 1-20.

Weller, I. and Matiaske, W. (2009), “Persönlichkeit und Personalforschung. Vorstellung einer Kurzskala zur Messung der ‘Big Five’ [Personality and personnel research: introduction of a short scale for measuring the ‘Big Five’]”, German Journal of Human Resource Management: Zeitschrift für Personalforschung, Vol. 23 No. 3, pp. 258-266.

Wickham, H., François, R., Henry, L. and Müller, K. (2022), “dplyr: a grammar of data manipulation”, R package version 1.0.10.

Willson, R. and Given, L.M. (2020), “‘I'm in sheer survival mode’: information behaviour and affective experiences of early career academics”, Library and Information Science Research, Vol. 42 No. 2, p. 101014.

Wüstenberg, S., Greiff, S. and Funke, J. (2012), “Complex problem solving — more than reasoning?”, Intelligence, Vol. 40 No. 1, pp. 1-14.

Zhang, M. and Jansen, B.J. (2009), “Influences of mood on information seeking behavior”, April 2009, New York.

Zhao, B. (2011), “Learning from errors: the role of context, emotion and personality”, Journal of Organizational Behavior, Vol. 32 No. 3, pp. 435-463.

Acknowledgements

The authors declare no conflicts of interest.

Erratum: It has come to the attention of the publisher that the article, Leiß, T.V. and Rausch, A. (2023), “Informal learning from dealing with software-related problems in the digital workplace”, Journal of Workplace Learning, Vol. 35 No. 9, pp. 291-310. https://doi.org/10.1108/JWL-03-2023-0042, incorrectly listed references to three articles cited in the article as being authored by Rausch when the text should have read Rausch et al., and referenced an article as being authored by Leiß when the text should have read Leiß et al.

This error was introduced in the editorial process and has now been corrected in the online version. The publisher sincerely apologises for this error and for any inconvenience caused.

Corresponding author

Tamara Vanessa Leiß can be contacted at: leiss@bwl.uni-mannheim.de