
Industry-funded degrees – a novel idea?

17 Apr

Fast food chain KFC made the news this week for its commitment to match-funding a BA in Management for 60 of its employees. Employer-funded training is not a new phenomenon: many organisations, particularly in the business and engineering sectors where specialist skills contribute directly to business objectives, support their employees’ development to Foundation Degree level.

KFC have committed to matching the employees’ costs for tuition and accommodation, purportedly costing the company around £9,000 a year per student. What are the implications of this for KFC’s management students and for Higher Education as a whole? The rest of this post summarises some of my ideas on the subject.

The degree is currently being offered by just one university. This is not unusual for employer-funded training: in apprenticeships and Higher Education, employers tend to form partnerships with one institution, or a few, that are happy to integrate the specific training the company wants its employees to receive into the general content of the course. While this potentially has implications for the usefulness of the qualification to employees in the wider labour market, the regulation of academic qualifications in the UK and Europe exists to prevent the content becoming too specific to a single employer.

One clear implication of the university-employer partnership is students’ loss of choice of institution: KFC’s management students will all study with the same university if they wish to make use of the match-funding. This is significant given the emphasis on choice in today’s education ‘market’. However, it may also widen participation in Higher Education, engaging employees who might not have considered university for lack of access to the necessary funding or the grades required for entry. This programme circumvents the rise in fees that has been the topic of much recent discussion, and the organisation’s link with the university delivering the course may well reduce the entry requirements for the students it refers. So, although this potentially undermines the idea of choice and competition that has dominated the sector, its impact on widening participation might be welcomed.

It is difficult to draw conclusions on this specific example without further detail about precisely how it will work. But however else it is viewed, this move is representative of a wider, gradual integration of Higher Education and industry, along with increasing pressure for universities to focus on action research which has a direct impact on local or national industrial, economic or social objectives. As already mentioned, employer involvement in and funding of education is not new or ‘novel’: Foundation Degrees, vocational training and even Master’s degrees have long received elements of employer funding. Certainly it will frustrate those with a ‘traditional’ view of Higher Education, who continue to emphasise the value of enquiry for its own sake and for its capacity to prompt innovation and truly new ideas – perhaps this is the biggest loss of our increasingly instrumental education system.

My big TEL question

4 Apr

Just how important is technology to teaching and learning?

As I look at this question, two things strike me. Firstly, it’s a very big question! I hope the ocTEL course might help me to establish where I stand on the issue. This leads to my second observation: it might appear from this question that I dislike, or doubt the value of, learning technology.

This is not the case. I love new technologies: I love finding out how new tools work, what others believe they can offer and what I feel I can do with them. But exciting new tools are nothing without effective teaching practice, and this, at times, is forgotten in the techno-joy which drives us TEL enthusiasts. It’s easy to be excited about the ‘power’ or capacity of a new technology, but are we sometimes guilty of going straight to questions about how to get academics and students to engage with it, before we properly consider its actual pedagogical value? Earlier today I read this article on the ‘power’ of Twitter to reach people all over the globe: this is something I love about all forms of social networking, but if ‘power’ implies the capacity to effect change, I fail to see how this aspect of Twitter is intrinsically empowering, particularly in an educational context.

So my question should perhaps be this: what do learning technologies allow us to achieve in teaching and learning that could not be achieved without them?

For me, the debate on the value of TEL ties in with broader social narratives about the importance of technology in our everyday lives. In education, the discourse goes that we must use the new technologies our students increasingly expect; yet at the same time we discuss ways to improve staff and students’ information literacy, so this cannot be the whole story. It seems we are simply living in an age where, for a multitude of complementary reasons, technology is being increasingly integrated into all aspects of human life, including education. The fact that we do not fully understand this process is, I believe, justification for returning regularly to my big question.

A ‘Brave New World’ of Higher Education?

1 Apr

I’m reading lots of discussions at the moment about whether there is a need to change the way that Higher Education is delivered – invariably linked to the idea of technology-enhanced learning (TEL). On one side of this debate is a fervent emphasis upon the potential of technology to transform teaching and learning, and the need for a new approach to meet the needs and demands of a new generation of student-consumers. On the other, the traditionalists dismiss both the idea that a new approach is needed and the capacity of new technologies to add value: the former is moral panic and the latter a gimmick.

But whatever your viewpoint, this whole debate suggests a need to make some kind of momentous decision. How can this be? Social change is gradual. These debates are becoming mainstream precisely because the change is happening – gradually, and not necessarily in the directions we imagine and talk about now, but happening nonetheless. Universities are changing the way they attract, and teach, students: more traditional, red-brick institutions might be more resistant to this change, but are inevitably following the innovative trends set by the competition. This is driven by the perceived expectations of students, which are in turn shaped by the attitude of the universities, as well as by state rhetoric, internalised from the assertions of school teachers and careers advisors that a degree is the way into a good job. The fact that students now purchase Higher Education supports the idea that they are doing so for tangible future gain.

We also live in a time when social attitudes and political discourse favour ‘networking’ and knowledge transfer over more ‘traditional’ didactic teaching. ICT has a hand in this: the imperative becomes not ‘knowing’ a subject inside out, but knowing where and how to access information about it. Knowledge is shared between business and academia, between experts from different disciplines, and between students and their teachers. Academics no longer teach students a set of facts, but are responsible for encouraging them to enquire for themselves.

Is this so new? We have long valued teaching that goes beyond rote learning: project-based learning in the fields of science, technology and engineering undermines the sense that this is a new phenomenon, and students of the arts have long been encouraged to make and defend their own interpretations of the subject matter. So what change does increasing use of ICT in teaching represent? Seemingly little more than experimentation with new tools to continue this trend of innovative teaching. The fear of TEL perhaps links more tangibly to a distaste for all things new: the changing place of Higher Education in society, and the new tools which have become available to support teaching and learning. The suggestion that these are intrinsically linked is an exaggeration, and one which justifies this fear rather than facing it.

Why this MOOC is so different to anything I’ve ever done… it’s liberating, but confusing

10 Feb

I pride myself on being pretty familiar with, and capable of, independent study. I’m a full-time undergraduate student, doing this course as an ‘added extra’, purely out of interest in the topic; before this I spent a year learning with the Open University alongside my full-time job. I’m used to balancing competing pressures, I know to look for answers to questions before asking my teacher, and I understand the value of sharing and developing knowledge collaboratively. Two weeks into the course I am still struck by its sheer scale – so many people I don’t know, communicating through so many different platforms – but I have become accustomed to this and have found the ways of communicating with fellow students that work best for me.

There is one thing, however, which I have struggled to get my head around; something that crystallised in my mind when reading the thread ‘Where are the professors?’ in the course discussion board. I have been merrily posting, tweeting, blogging and discussing the resources for two weeks with whoever has the patience to listen, read or comment; but for the first time in my ‘formal’ educational experience I do not have to be aware of what my lecturer thinks or is looking for, of the course aims, or of the assessment criteria. I am writing my own opinions of the resources and the course as a whole because I am not bound by the curriculum, or by a need to demonstrate any particular knowledge or skills in my writing. Never before, even in formative learning, have I felt this free to interpret learning material in my own way, rather than to a prescribed framework.

And the result? Well, I’m having a great time, for starters. It’s really liberating not to have to prove my understanding of a given conceptual framework, cite specific theorists or memorise dates. I am aware that to an extent I am doing this implicitly, in order to make sense of what I have seen and read, but I feel much freer to interpret it in my own way. What I don’t know (and what someone in the above thread quickly picked up on) is whether what I’m coming up with is any good. Am I poorly informed? Have I completely missed the point? How will I know if I have? Are my opinions frankly facile and boring? (Quite possibly.) So while I enjoy the freedom of this model, I have no established set of expectations against which to judge myself. This is strangely disconcerting.

Clearly this offers me the opportunity to learn to express myself freely; and hopefully to judge what is and is not ‘valuable’ knowledge (whether my own, or someone else’s). … But how will I know when I have achieved this?!

Intentions or tools? A technological determinist view of the future

4 Feb

The second week of E-Learning and Digital Cultures has us looking to the future, and if the short films which form part of the week’s resources are anything to go by, we have very little to look forward to!

On the positive side, we are offered a shiny, clean-edged utopian vision of one possible future by Microsoft in Productivity Future Vision, where glossy, intuitive tablet technology makes everyday tasks and communication so easy that we barely notice the tech is there at all. This is made possible, perhaps, by Corning’s versatile glass (A Day Made of Glass), which plays a central, if elegantly understated, role in bringing the benefits of streamlined technology to the office, healthcare, education and family life alike. The technology is beautiful in its simplicity, its subtlety and the way it blends neatly into our lives. Perhaps most notably, these two films demonstrate how the technology fits into our everyday lives, and how we can choose to use it or not: in Productivity Future Vision the girl is able to share recipes with her mother over a video link before downloading the chosen one and then beginning to bake the cake by hand. Given the futuristic nature of the film, Microsoft could have shown her 3D-printing it, but this perhaps is a step too far for us and would undermine the comfortable way in which the technology integrates with our lives.

Sight turns this notion on its head: there is agency in this film too, but it is exercised by one human for the domination of another, in a fairly disturbing fashion. Interestingly, my first reaction to this film was one of horror at the way the victim is made so vulnerable by the technology; but actually people are able to achieve the same end in today’s society, using different tools. The intention itself is repugnant, but the way it is enacted is shocking partly because the method is so alien.

In both Sight and Productivity Future Vision technology is in fact no more than a tool, and is portrayed as such; but the actions of those using it totally skew our perceptions of the technology itself. We associate the action – a homely mother-daughter conversation and cake-baking, or the horrendous exploitation of another human being – with the tool used to achieve it. Is this because it is easier to blame the means than the human actor? Or because these human actions, good and bad, are familiar to us, while the technology is not?

So, am I a digital native?

30 Jan

As a mature undergraduate student and (fairly recently converted) technology enthusiast, I find myself wondering whether I fall into Prensky’s (2001) ‘digital immigrant’ category of grown-up professionals for whom technology is essentially a second language, or his ‘digital native’ category of young students fluent in its complex syntax. I love technology, although I found this love late on, and am still learning. I see its potential to change the ways in which we approach problems, connect with others, and generate and share ideas: I have just started an E-Learning and Digital Cultures MOOC to develop these skills. I remember the internet being new and wondering what on earth anybody would use it for. Now I use it every day. I tweet, I message, I search and I tag. I also read books, including books about technology, and I listen, in person, to other people telling me things they know. All of these things work, in the right context. I get excited about new technologies, sometimes purely for their own sake. But I don’t intrinsically understand how they work. Nor do I see why new approaches should automatically replace old ones. So, am I a digital immigrant or a digital native?

The idea of the digital native, by now widely critiqued by academics, teachers and learning technologists alike, appears to rest on the assumption that there exists some watershed between ‘new’ and ‘old’, and that we all fall to one side or the other of it. But viewed from a social determinist perspective (Dahlberg, 2004), Web 2.0 was simply another step, albeit a large and highly significant one, in the incremental development of communications technology which began with the telegraph. Attention to the social context of the digital native also highlights another set of issues. No student comes to rely upon new technologies without consistent, ready access to them, and in no society is access to expensive devices equal. Individual preferences and social influences shape people’s choices of devices and platforms, the situations in which to use them, and whether to use them at all. The availability of a technology does not mean a whole cohort of society will inevitably become familiar with it, or choose to use it. Poorly informed use of technology in teaching and learning, then, in a desperate attempt to capture the imagination of the mythical digital native, risks alienating students who feel the particular tool is irrelevant to the subject. Perhaps more worryingly, it also risks leaving behind the already disadvantaged, who potentially stand to benefit the most from education.

I learned to use the internet to email, shop, research and communicate from the age of about fourteen onwards. If I had started aged five, or fifty, I would still have had to learn. When I encounter a new tool, program or platform I need to learn how it works. I have been aware of technology becoming increasingly intuitive: this was, and is, a gradual process which reflects the achievements of developers rather than my implicit understanding. My point is that we are all, in one way or another, digital immigrants. We are not born knowing how digital technology works, but grow accustomed to it as it adapts to us. As Dahlberg (2004) argues, attention to the social, historical, individual and technological elements of innovation is the only way to understand how technology has been shaped by us, just as we learn to adapt to it. The utopian ideal of the digital native is, then, a convenient way to avoid thinking too hard about how we might adapt technologies to serve educational purposes: and that most certainly is not what our students need.

Dahlberg, L. (2004). Internet Research Tracings: Towards Non-Reductionist Methodology. Journal of Computer-Mediated Communication, 9(3). http://jcmc.indiana.edu/vol9/issue3/dahlberg.html

Prensky, M. (2001). Digital Natives, Digital Immigrants. On the Horizon, 9(5). http://www.marcprensky.com/writing/prensky%20-%20digital%20natives,%20digital%20immigrants%20-%20part1.pdf