Is it possible that much of the engineering education research community, myself included, has misunderstood the notion of competency? Like many others, I suspect, I was unaware of the literature drawing attention to some of the mistakes that can easily be made when talking about competency. I conclude by suggesting a way forward, beyond ‘competencies’.
How did I reach this position?
Russ Korte and I recently completed a chapter for a book on engineering education research. We set out to explain how research studies of engineering practice could inform educators. We looked for research evidence that could help us connect education with practice and did not find much. The fragments we located were hardly encouraging, suggesting that major curriculum changes to develop young engineers with a deep understanding of sustainable design did not make much difference in the workplace.[1]
When we widened our search beyond engineering education, we stumbled onto the smouldering embers of an academic debate from the closing decade of the 20th century: Should curriculum be defined by dictating content or by specifying student outcomes? These debates led to the contemporary notion of competencies in education.
Human resources psychologists had already fought over this terrain: Should recruiters focus on the content of a job (job analysis) or on candidate characteristics (behavioural competencies)? Any recruiter who finds a reliable way to predict career performance can turn that into a goldmine. Management recruiters, in particular, pay handsomely for psychometric tests that claim to do that. In the end, competency proponents won, perhaps because competencies relieved recruiters of the challenge of knowing a lot about the jobs they were trying to fill. Job analysis, the preceding paradigm, was swept aside in the rush to anoint behavioural competencies such as emotional intelligence as the holy grail of recruitment.[2]
Education bureaucrats looking for quick policy wins weren’t far behind. What counted in education, they claimed, was what students could do, not what they could recite in examinations.[3]
A few researchers pointed out discomforting contradictions.
They showed that competency statements always rely on hidden, unstated knowledge. Gerard Lum provided this nice illustration:
Bill cut the cake
Teresa cut the grass
Cleo cut the cloth
In all three sentences, the word ‘cut’ has the same meaning. However, most of us would not consider Bill’s performance ‘competent’ if he used Teresa’s lawnmower. This simple illustration reveals implicit knowledge that most of us share without necessarily saying so: for a competent performance, Bill would have used a cake knife and Cleo a sharp pair of scissors. Lum and others showed that, no matter how detailed the competency statement, there is always more that needs to be defined. The idea that we can fully specify competent performance for another person is an illusion.
Paul Hager distinguished three concepts: (a) performance attributes (competencies); (b) the underpinning constituents of competent performance; and (c) the education, training or development of people to become competent performers.[4] Problems appear when we equate performance outcomes with skills and capabilities. Performance outcomes specify activities and tasks, not knowledge and skills. Linking the two requires an inferential leap that is inherently vague and imprecise.
Unfortunately, cries for caution were crushed in the education reform stampede, led by Australia and the UK. Other OECD countries were in hot pursuit. These changes led to the ABET EC2000 reforms.
Twenty years later, employers still complain about the same graduate weaknesses as they did throughout the 20th century.[5] What happened? How can we explain why so much reform effort failed to move the dial on graduate performance, or at least on perceptions of graduate performance?
In compiling the influential OECD DeSeCo report,[6] the two Swiss authors noted one of the stumbling blocks. They pointed out that
“A competence is defined as the ability to meet individual or social demands successfully, or to carry out an activity or task… This demand-oriented definition needs to be complemented by a conceptualization of competencies as internal mental structures – in the sense of abilities, capacities or dispositions embedded in the individual. Each competence is built on a combination of interrelated cognitive and practical skills, knowledge (including tacit knowledge), motivation, value orientation, attitudes, emotions, and other social and behavioural components that together can be mobilized for effective action. Although cognitive skills and the knowledge base are critical elements, it is important not to restrict attention to these components of a competence, but to include other aspects such as motivation and value orientation.”
Unfortunately, many of us who read their report jumped to the conclusion that competencies are simply combinations of knowledge, skills, abilities and attitudes, or just ‘KSAs’.[7]
Along with that emerged another assumption, that competencies are the foundation of successful professional practice.[8]
In the relatively closed world of engineering education research, we were oblivious to the work of people like Hager and Lum, who had already exposed conceptual weaknesses that would greatly limit the reach of the competency movement.
Inferring a person’s level of competence, and the knowledge, skills and attributes that enable them to perform competently (to ‘cut the cake’), relies on background knowledge of the context in which the work is performed.
Recent research illustrates the influence of background knowledge. For example, when learning written communication skills, students model their instructors, preferring written communication with complicated sentence structures: “making it sound professional”. Professional engineers, on the other hand, write short, simple, direct sentences to leave less room for misunderstanding.[9]
Does this mean that the notion of competency is of little or no use?
I don’t think so. Competency as an attribute of performance can tell us about the progress made by an individual along the journey towards mastery. Learning is a process and competency assessment can help us measure progress. A golf handicap indicates player competency. All golfers hope to achieve gradual reductions in their handicap as their competence on a particular course improves.
Competencies are relevant in engineering practice.[10] However, they are broad, rely extensively on background knowledge and many have multiple aspects. An individual might have reached a professional level of performance in one aspect, yet might still need a supportive work environment and supervision to be capable of competent performance in another aspect.
So, how can we move beyond competencies to improve engineering education?
While research publications and teaching performance continue as the primary selection criteria for engineering faculty staff, it is unrealistic and unreasonable to demand that instructors also have recent experience with commercial engineering practice.
An engineering curriculum necessarily emphasizes engineering science and mathematics. Therefore, with no knowledge of practice, students imagine that engineering work is technical and demands a high level of proficiency in mathematics and science fundamentals.
Research reveals three groups of performances in engineering work:
Group 1: hands-on interactions with tools, measuring instruments and objects;
Group 2: cognitive interactions with abstract object representations, including analysis, modelling, programming, design, and technical problem-solving;
Group 3: socio-technical interactions with other people to collaborate in and coordinate Group 1 and Group 2 performances, and to advocate for particular interpretations of requirements and for resources.
Research has demonstrated that professional engineering work, even for novice engineers, predominantly consists of Group 3 performances, with much less time spent on Group 2 performances. Few professional engineers perform significant hands-on work. Most engineers identify Group 2 performances as “real engineering”, while reluctantly conceding that Group 3 performances represent most of their work.
This can be disconcerting for engineering educators because, at first sight, it seems to challenge the priority they attach to knowledge of engineering science and maths proficiency. It is also disconcerting for engineering novices. Just yesterday, a young civil engineer with five years of experience told me “What I do every day is 180 degrees different from my studies!”
Research studies provide insights that explain why socio-technical interactions take so much time. Engineers rely on distributed expertise[11] beyond their personal knowledge, most often arranging for others with the required expertise to contribute skilled collaborative performances. That’s much faster than learning for oneself: knowing who to ask is the key to bypassing technical problems.[12] Engineers also spend much of their time evaluating technical interpretations that mutate in the minds of the people implementing solutions as they work around regulatory, environmental, safety, expertise, logistical, and financial constraints. The effects of these constraints may be unfamiliar to the engineers who conceived the original technical intentions. Engineers use technical expertise to identify where implementation teams may be diverging too far from the original intentions, and analysis capabilities to assess the consequences and decide whether to advocate for corrective actions. So, amongst the countless socio-technical interactions every day, in meetings, by email, text messages, video calls, reports and other information systems, engineers are called upon to make rapid, consequential technical decisions, often based on imprecise information.
The first task for educators, therefore, is to infer the educational objectives, the required capabilities, from descriptions of engineering work performances. The second task is designing educational activities likely to help students develop those capabilities.[13]
Well-established instructional design processes can help with the second task (see suggestions below). Unfortunately, the performance descriptions are missing, and without them the first task is impossible.
We do have detailed research studies of engineering practice in several countries and industries, perhaps 200 or more. However, it is entirely unreasonable to expect engineering faculty to read even this relatively small collection of research literature on top of all the other expectations we place on them. Therefore, what we urgently need is a compact, concise, and compelling description of professional engineering work that is accessible to time-poor engineering instructors with little prior understanding or experience of practice.
Such a concise description of engineering work might help resolve tensions in the education community between those advocating for ‘generic’ skills and others arguing for mastery of technical fundamentals. This tension is evident in most of the accreditation discussions that I have experienced.
So, to move beyond competencies, we first need a description of engineering work written for time-poor engineering instructors. That’s an interesting task for engineering practice researchers more accustomed to 10,000-word papers written for social science audiences. Engineering students can help by studying the work of recent graduates, although they, too, need a research-based intellectual framework to help them identify practices they might otherwise dismiss as irrelevant.
Studies that identify the ways that education influences practice would help. It’s challenging because the practice environment greatly influences the degree of competence attainable by an individual. For example, most engineering graduates learn little about engineering standards in their courses. Therefore, on graduation, few if any graduates could claim any competence in applying engineering standards relevant for their field of practice. However, a graduate commencing with a firm that provides close supervision for early-career engineers would quickly gain competence because the supervisor can show the graduate where to start and how to apply the standard effectively in the firm’s context.
We found no research evidence linking competency assessments with workplace performance by early-career engineers. The evidence we found highlighted the significance of group performances rather than individual contributions, pointing to the need to study both engineers and the organizational contexts in which they work. Maybe we looked in the wrong places.
Given these knowledge gaps, engineering practice researchers have some challenging investigations ahead. You’re welcome to add some comments.
Acknowledgements
I am very grateful for discussions with Russell Korte and Honor Passow that contributed ideas for this short paper.
Suggested Instructional Design Literature (from Honor Passow)
For an approachable book on instructional design (not much coverage of assessment, but the other instructional design elements are well done):
Dirksen, J. (2015). Design for How People Learn (Voices That Matter) (2nd ed.). New Riders. ISBN 978-0134211282.
A more formal book with all the processes named with the terminology of instructional design:
Brown, A. H., & Green, T. D. (2019). The Essentials of Instructional Design: Connecting Fundamental Principles with Process and Practice (4th ed.). Routledge. ISBN 978-1138342606.
For a definitive source on how to conceptualize competencies with a stunning tabular organization:
Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., . . . Wittrock, M. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York, NY: Longman.
For an approachable book on how people learn:
Brown, P. C., Roediger, H. L., & McDaniel, M. A. (2014). Make It Stick: The Science of Successful Learning. Belknap Press of Harvard University Press. ISBN 978-0674729018.
[1] For example, (Buch, 2016)
[2] Shippmann et al. (2000) provide a major assessment by the American Psychological Association.
[3] Hager and Beckett (1995) provide background on the education reforms.
[4] (Hager, 2004, pp. 412-413)
[5] See for example (Brunhaver, Korte, Barley, & Sheppard, 2017)
[6] (Rychen & Salganik, 2002, p8)
[7] (Male, Bush, & Chapman, 2009). In her PhD thesis, Male (2010) noted that several studies had adopted a similar notion of competencies associated with knowledge, skills and other attributes. Later, the notion of competencies as combinations of knowledge, skills and abilities strengthened, as in Passow and Passow (2017, p. 477): competencies are “the knowledge, skills, abilities, attitudes, and other characteristics that enable a person to perform skillfully (i.e., to make sound decisions and take effective action) in complex and uncertain situations such as professional work, civic engagement, and personal life.”
[8] Widely adopted by professional accreditation agencies, and considered to be the foundation for practice in the literature (e.g. Passow & Passow, 2017, p. 477).
[9] (Conrad, 2017)
[10] E.g. (Male, 2010; Passow & Passow, 2017). An example from Male is “Communicating clearly and concisely in writing (e.g. writing technical documents, instructions, specifications)” and from Passow: “Organize, analyze, and interpret data. Question the validity and reliability of measures. Explore trends and anomalies.”
[11] (Trevelyan, 2010)
[12] (Korte, Sheppard, & Jordan, 2008)
[13] (Lum, 2004, p. 496)
References
Brunhaver, S. R., Korte, R. F., Barley, S. R., & Sheppard, S. D. (2017). Bridging the gaps between engineering education and practice. In H. Salzman & R. Freeman (Eds.), US engineering in a global economy (pp. 129-163). Chicago: University of Chicago Press.
Buch, A. (2016). Ideas of holistic engineering meet engineering work practices. Engineering Studies, 8(2), 140-161.
Conrad, S. (2017). A Comparison of Practitioner and Student Writing in Civil Engineering. Journal of Engineering Education, 106(2), 191-217. doi:https://doi.org/10.1002/jee.20161
Hager, P. (2004). The competence affair, or why vocational education and training urgently needs a new understanding of learning. Journal of Vocational Education & Training, 56(3), 409-433. doi:10.1080/13636820400200262
Hager, P., & Beckett, D. (1995). Philosophical underpinnings of the integrated conception of competence. Educational Philosophy and Theory, 27(1), 1-24. doi:10.1111/j.1469-5812.1995.tb00209.x
Korte, R., Sheppard, S. D., & Jordan, W. (2008, June 22-26). A Qualitative Study of the Early Work Experiences of Recent Graduates in Engineering. Paper presented at the American Society for Engineering Education, Pittsburgh.
Lum, G. (1999). Where’s the Competence in Competence-based Education and Training? Journal of Philosophy of Education, 33(3), 403-418. doi:https://doi.org/10.1111/1467-9752.00145
Lum, G. (2004). On the Non‐discursive Nature of Competence. Educational Philosophy and Theory, 36(5), 485-496. doi:10.1111/j.1469-5812.2004.085_1.x
Male, S. (2010). Generic engineering competencies required by engineers graduating in Australia. (PhD). The University of Western Australia, Perth, Western Australia. Retrieved from https://research-repository.uwa.edu.au/en/publications/generic-engineering-competencies-required-by-engineers-graduating
Male, S., Bush, M. B., & Chapman, E. (2009, December 7-9). Identification of competencies required by engineers graduating in Australia. Paper presented at the Australasian Association for Engineering Education Annual Conference, Adelaide.
Passow, H. J., & Passow, C. H. (2017). What Competencies Should Undergraduate Engineering Programs Emphasize? A Systematic Review. Journal of Engineering Education, 106(3), 475-526. doi:10.1002/jee.20171
Rychen, D. S., & Salganik, L. H. (2002). Definition and selection of competencies (DeSeCo): Theoretical and Conceptual Foundations Strategy Paper. Retrieved from https://www.voced.edu.au/content/ngv%3A9408
Shippmann, J., Ash, R., Carr, L., Hesketh, B., Pearlman, K., Battista, M., . . . Sanchez, J. I. (2000). The practice of competency modeling. Personnel Psychology, 53(3), 703-740.
Trevelyan, J. P. (2010). Reconstructing Engineering from Practice. Engineering Studies, 2(3), 175-195. doi:10.1080/19378629.2010.520135
Comments (from Facebook AAEE Group and LinkedIn)
Roger Hadgraft: Thanks James for such insights. Maybe learning engineering is like learning a sport or a musical instrument, only much harder! Everyone needs to learn the basic skills of a game or a musical instrument and then they need to practise a lot, sometimes with others (team sport, orchestra). Then they need plenty of individualised feedback about their performances. I think it’s not that competencies aren’t well intentioned, it’s just that most of them are at the level of how to serve (the ball), not how to win the game, or even what the game is. The trouble with engineering is that it is so amazingly diverse that to define higher order competencies becomes increasingly difficult and a rapidly moving target in every industry. However, getting students to play the game, with industry and academic mentors, seems to be the best that we can do at the moment. Perhaps what we need to change is to get student (novice) engineers into the workplace asap rather than expecting them to spend 4-5 years on campus first. Hats off to CSU (Euan Lindsay) for showing how to do this. We need to get back to the 70s when this was a much more common practice.
Scott Wordley: Roger Hadgraft more direct industry exposure and practice is one approach. Or we help students to create their own mock workplace at the university: this is what student teams do.
Roger Hadgraft: Scott, yes, we definitely need both these approaches. The hard part is to give each student quality feedback about their performance. We don’t want to spend the money on it.
Thanks for your thoughtful comments, Roger Hadgraft and Scott Wordley. Reading the literature on work-integrated learning, which is similar to what you’re advocating, one runs into problems when ideas of practice imparted (tacitly) by instructors differ significantly from the realities of practice. See, for example, Ajjawi, R., Tai, J., Huu Nghia, T. L., Boud, D., Johnson, L., & Patrick, C.-J. (2020). Aligning assessment with the needs of work-integrated learning: the challenges of authentic assessment in a complex context. Assessment & Evaluation in Higher Education, 45(2), 304-316. doi:10.1080/02602938.2019.1639613. Just exposing students to practice does not work if the students are predisposed to see practice in ways that conflict with reality. Here’s an example. “What did you do in your practicum?” “I constructed a mathematical model of a process plant component from data.” “What part did your supervisor play?” “Oh, he was in meetings all the time or on the phone. He just left me on my own to get on with the work.” “Did you ask him if you could accompany him and sit in on the meetings?” “Nah, it was all admin bullshit anyway.” This student came away with the idea that engineering work is mathematical modelling, in part because this idea developed from experiences in university. Therefore, I am arguing that educators need a clear idea of practice before they can help students acquire more accurate notions of practice. In my analysis, exposure to practice will not shift students’ ideas much. There are no easy ways to help educators reach that point just now.
Incidentally, in the 1970s, far more engineering instructors had recent practice experience. Also, most students found their first jobs in government engineering institutions that served as ‘practice schools’ and labour pools for private companies who creamed off the best when they needed to hire engineers. That’s no longer an option.
From David Dowling via Facebook
Thanks James, Sally and Roger. The wheel turns. Some memories from my education in surveying and civil engineering in the 60s and 70s.
1. The surveying staff at RMIT had to work at least 1 day a week in a surveying practice.
2. Night classes in professional level engineering programs were offered by many institutions in Melbourne (RMIT, Swinburne, Caulfield etc.) for those who worked full-time and studied part-time. Probably more than half of all engineers studied that way.
3. Surveyors had to do 2 years as an articled pupil after completing their degree, during which time they learnt and practised at every level of the surveying team (chainman, technician and surveyor). This meant they understood all the roles and could employ and train people in those roles. Similar training arrangements were used for engineers in the government departments I worked in or liaised with.
Some comments from my recent experiences.
1. Compulsory residential schools and survey camps provided opportunities for us to observe and question the practices of students. They often used methods that were OK but were, in reality, short cuts based on assumptions. The students often did not know the correct methods, the theory behind them, or the assumptions the short cuts were based on. One presumed their boss knew all this but failed to pass it on when training staff.
2. With the focus now on research, the majority of academics have not worked in industry and therefore have neither the understanding of practice nor the skills to teach and assess these aspects.
3. The majority of academics do not have the time to assess competencies (particularly higher-level ones) in these times of mass education and tight budgets. That is, unless they go above and beyond and sacrifice their research or life balance.
Hans Tilstra, Curriculum Advisor at Keypath Education, via LinkedIn
Every university and accrediting body will ensure their students meet competency/capability/proficiency standards. However, once that box is ticked, what would nudge a university to go ‘beyond’?
Using Beetham and Sharpe’s four types of pedagogies, universities lean heavily on associative pedagogies. To be fair, engineering faculty in particular are relatively good at embedding constructive pedagogies as well.
In the pedagogical mix, the financially most challenging are the situative pedagogies. These are expensive, whether offered as work-integrated learning (WIL) or work experience in industry (WEI).
One admirable example is Queen’s University’s Engineering Design and Practice Sequence https://edps.engineering.queensu.ca/about-the-edps/ .
I often wonder what would happen if students in such a practice ‘strand’ would make co-authored research a normal part of their studies. What topics (Drawdown? SDGs?) could be investigated by students in such a way that their work contributes to a larger data set?
(Appended diagram from Appendix 1 of Beetham & Sharpe, Rethinking Pedagogy for the Digital Age.)