What expectations should we have of technology (hereafter, tech) in the work of higher education? What is the proper place of tech? What should developers do about tech?
Good tech has at least these three related qualities. It just works. It does, or it can, make things better. And after a while it becomes almost invisible, almost unproblematic.
Creative disruption, then improvement, then a further step into cyborgia
With a new tech – whether it is new to the world, new to the institution or discipline, or new to the group or individual – there is an initial period of excitement and learning, sometimes accompanied by fear. During this period we discover the range of things that we can do with the new tech that we couldn’t previously do, or could do only with some difficulty, or less well, without it. We explore which of these new possibles we want to, and should, and can, use. Well managed and supported, this can be a time of creative and productive disruption.
And then the tech almost vanishes into us and our organisations, becomes embedded in our practice and in our thinking. We have become in one more sense tech-enhanced humans and organisations, as we did when we first wore glasses or contact lenses, or rode a bicycle, or drove, or travelled by aeroplane, or built a building. We have advanced a little further into the condition of cyborgia.
The ‘almost’ is important. Hopefully we are still at least a little conscious, when for example we telephone, of what we are doing. As we make the call, hopefully we are sensitive to the risk of intrusion, aware of the context of the person we are calling, prepared at least a little for whatever the call may bring us both. But the fact that we can often speak to someone without visiting them, through a system of vast and invisible complexity, is now, in the moment, for most people, relatively unproblematic.
How does this relate to the technologies of our work? What expectations do we have of the tech in use? Here are a few of my answers. You might find it useful to spend a few seconds noting what (else) you expect from the tech-supported people with whom you work.
We expect …
- We expect to be able to compose and then send a message, perhaps with attachments in various media, to (very, very nearly) everyone we know professionally, and maybe also personally. We are confident it will be in their inbox within minutes. We expect to send this message without having to remember their contact details, just their name. No envelope or stamp required.
- Building on this, we expect to be able to communicate with similar ease with defined groups and subsets of the people with whom we work.
- We expect to be able to find, within seconds, contact information for someone we don’t yet know.
- We expect to find at least a half-way useful answer, or at least a starting point to an answer, to an increasing number of questions, of growing complexity, by typing the question into a search engine.
- We expect ourselves, and those with whom we communicate, to write in language that is grammatically correct and correctly spelt, at least according to the views of our software provider.
- We expect those with whom we work to be able to locate and make critical, intelligent, appropriate use of (a) information at which we point them and (b) information of particular interest and use to them which they find for themselves; and then we expect them to make and share connections and relationships between information from these two kinds of sources.
- Beyond literacy, beyond competence, beyond capability, we might expect or hope for a degree of fluency. Fluency in working with words and numbers and images and ideas appropriate to our disciplines and our professional and personal life. Fluency also in using the technologies through which these various elements of academic and professional work and life are more and more often created and manipulated and communicated and read and studied and used.
We shall expect …
You might find it even more interesting to cast an eye 50 years and more into the future, and begin to consider what expectations are reasonable, for the current students within our universities, throughout their working and personal lives. Maybe I shall have a go at this in a future post, remembering that, as Niels Bohr said, “Prediction is very difficult, especially about the future.” Note that he did not say ‘impossible’.
We won’t be able to teach our students at university all the necessary skills for their next 50 or more years. All we can do is help them become able, keen and confident to learn, of course selectively and critically, whatever new tech they want or need to learn. Because the great majority of the tech will continue to become easier to learn and use. Whatever we may think about markets, the market should at least achieve this.
Note that these current and future expectations cannot neatly be separated out into our expectations of the tech and our expectations for individuals, although some of the expectations may be tech-led and others more people-led. They are all expectations of individuals using the tech.
In the next post I shall test these expectations against our current experiences and realities of using the tech, and explore the places of tech in higher education.
In previous posts in this series I have explored relationships between originality, education and learning, and ways in which originality can be developed. If you’re starting here, welcome, and you may find it useful to at least skim these previous posts. In the last of this series, for now at least, I shall explore the big one, the relationship between originality and knowledge.
Why do I call this the big one?
The academic world reveres knowledge. Research is valued as the production of knowledge. Teaching is often described (and also experienced) as the transmission or handing on of knowledge. Expertise involves (not exclusively) having knowledge. Experts are people who know a lot. This academic view and valuing of knowledge is reflected in the popular domain, where quizzes mostly value knowledge, much less often valuing the ability to reason, solve problems, or make connections – abilities seen in exceptions such as The Krypton Factor and Brain of Britain.
How does originality relate to this very high value placed on knowledge?
Originality and the development of new knowledge
One way is through the role of originality in the development of new knowledge. This is an often mysterious, hidden, hard-to-describe process, even for those who develop such new knowledge. And sometimes the process is described very vividly, as in Kekulé’s account (see http://tinyurl.com/Kekule) of realising that a possible structure for benzene could be a six-carbon-atom ring, rather than a string – an idea that came to him through a vivid daydream of a snake eating its own tail.
Do such accounts offer help for those who would create new knowledge? I think so. Such stories, perhaps with the hero narrative played down a little, suggest the value of letting imagination run free, allowing wild images to form and then checking what implications the images may have for the problem at hand.
We find an important link here between originality and knowledge, through a scientific method in which hypotheses, models, explanations can be developed through any process at all, then tested rigorously for their predictive or explanatory power.
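That generate-then-test pattern can be sketched in miniature. This is only a toy illustration, with invented data: candidate ‘hypotheses’ (simple linear laws) are produced by unconstrained random guessing, and only the test – predictive error against the observations – is rigorous.

```python
import random

# Invented 'observations', secretly generated by the law y = 3x + 2.
data = [(x, 3 * x + 2) for x in range(10)]

def error(hypothesis, data):
    """Total squared prediction error: the rigorous test."""
    a, b = hypothesis
    return sum((a * x + b - y) ** 2 for x, y in data)

# Generation is unconstrained -- here, pure random guessing of (a, b).
random.seed(0)
candidates = ((random.uniform(-10, 10), random.uniform(-10, 10))
              for _ in range(100_000))

# Only the hypothesis that best survives the test is kept.
best = min(candidates, key=lambda h: error(h, data))
a, b = best
print(f"best surviving hypothesis: y = {a:.2f}x + {b:.2f}")
```

The point of the sketch is that the generator knows nothing about chemistry, or daydreams, or snakes; the discipline lies entirely in the testing.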
Creating as well as testing hypotheses
It may be that current education places a little too much emphasis on the rigorous testing of hypotheses, and not enough on generating the hypotheses in the first place. This imbalance may in turn draw a picture of science and technology, and perhaps other disciplines involving some element of critical analysis – hopefully, then, most disciplines – as more procedural, more knowledge-stuffed, and less welcoming of originality, than they actually are. Here, perhaps, is a route to making many disciplines more attractive to a wider range of students; perhaps, also, to making them more fun, and maybe even more productive.
This does not mean a lowering of standards. Only ideas that survive tough tests will become accepted as valued knowledge. The academy is safe.
Originality valued as the development of hypotheses for testing can also bring to life the sometimes empty rhetoric of constructivist approaches to learning, by being explicit about what is being constructed – hypotheses – and saying how these hypotheses will be used.
I realise, or hope, that there are vast differences between different disciplines in these respects.
Perils of over-emphasising knowledge
I sometimes fear that over-emphasis on knowledge – whether propositional (know what), procedural (know how), or conceptual / theoretical (know why) – may tend to drive out originality. But before that: there is a hierarchy of valuings of knowledge. The language of education shows clearly how propositional and conceptual / theoretical knowledge are valued over know-how. The UK Minister of Education has made this utterly explicit very recently – http://tinyurl.com/GoveKnow. Know-how is usually referred to as skill, and generally has lower status than knowledge. (Events can re-balance our view of this. As an eye surgeon replaced my somewhat cloudy lens with a shiny new plastic one earlier this week I was hugely more concerned with her skill than her knowledge, much though I also value the latter. Actually I was unconscious at the time, but you know what I mean.)
A race to the bottom
Knowledge on the page or the screen looks so certain, does it not? The first, natural, thing for a learner to do with knowledge on a page seems to be to try to learn it. Teachers, valuing what they know, have a corresponding tendency to teach it. The players having variously taught it and learned it, the next obvious thing is to assess it, to find out if it has been learned. Propositional knowledge consists mainly of – well, propositions. Conceptual / theoretical knowledge similarly consists of concepts and theories. And all of these tend to be taught and learned as stuff. The pathology of this is relatively easy to explain. Learning becomes memorising. Memorised knowledge is relatively easy to assess, however ambitious we are. And the sheer quantity of knowledge out there, sifted through the quality-assuring processes of refereeing and review, is enough to fill and over-fill any course we could design. Obviously, we must teach more. Because there is more to know. This is a kind of race to the bottom, not because knowledge is unimportant, but because, increasingly, it isn’t enough.
Our concerns about originality
Also, I suspect that we are ambivalent about originality. I suggested in earlier posts typologies of originality, from local and (on a separate axis) perhaps not world-changing (“I had that idea, though it may well be flawed, and others may well have had it before.”) to both global and world-changing originality (a version of e=mc2 in 1905). Thinking about originality may push us to reflect critically on the nature and extent of our own originality, reflection which we may not always find encouraging.
And anyway originality is hard to assess, is it not? Particularly if we are assessing local originality, where there may be an inverse relationship between knowledge and originality – the less I know, the more locally original ideas I may have.
The normal academic instinct, I think, at this point, is to let knowledge trump originality, to say “You should have known that.” rather than “Well done for having that idea.”
I feel, on balance, that this generally is an unhelpful stance for a teacher to take. Why?
A changing relationship between knowledge and originality
Knowledge is becoming much more readily accessible. The machines have replaced much manual work. They are now replacing more and more brain work, progressively leaving the more difficult and more rewarding work to us. The relationship between Moore’s Law of progress in the power of computers and their ability to do some of the difficult stuff we do (such as, of course, being original) may or may not be linear. But there will be some positive correlation, now and into the future.
But, however this plays out, I’m pretty sure that originality in graduates and academics will continue to become a more and more important and valued ability. Of course our graduates will still know a lot. But their knowledge will increasingly be a side-product of their ability to be critically original, working with and shaping the technology, and accessing and using the knowledge, selectively and critically, when they need it.
If you’re starting here, I suggest you scan the three previous posts on originality.
Becoming critically original
Of course, originality needs to become an increasingly critical originality. The particular critiques, and more broadly the critical approaches, will need to be developed by lecturers, by students alone and with their peers, and in conversation between students and lecturers.
The lecturer’s skill lies in getting the nature and weight and progression of their responses to student work right. Not treading on students’ dreams, nor dishonestly flattering under the badge of being sensitive, but rather steadily demonstrating and practising and discussing an increasingly critical and informed approach to work and study, which includes being explicit about the rationale for their critical comments.
Tempting clichés about steel and fires will be resisted. But the students need to test their original ideas against a growing range of the literature, against the established corpus of knowledge and (depending on the subject they are studying) perhaps also against their own experience and data.
This need not be a discouraging experience. The students will come to enjoy and value both the critical and the creative parts of critical originality. They will find the unexpected satisfaction which can follow from laying aside (perhaps with a sigh) an idea they have developed which is not supported by further reading and evidence. And they will find delight in, from time to time, confirming that their new idea has some strength and validity; has some explanatory, even predictive, power; and deserves to be taken further forward. Also, they will learn not to be discouraged when they find someone has got there before them. Local originality is not a failure of global originality. Rather it is a step on the long road that may lead to global originality.
And so on through the career of an academic.
Becoming a professor would not be the only happy ending to this story. Being critically original is a capability and an approach to work that is valued within and well beyond the University.
But if such critical originality is to be a goal of education, as well as an aspiration, we need to take it seriously, to be explicit about it, and to explain and illustrate in our own work what it can mean. We need to give students opportunities to develop their critical originality, and to receive feedback on their attainment of it. Students’ critical originality needs to be developed within the discipline being studied, although students may welcome the chance to apply the approach to other areas.
And we need to assess it in clearly valid ways. Generically, that might involve students undertaking some work that is at least locally original; critiquing the work; and identifying and making a reasoned case for the nature and extent of its originality. This will play out differently in different subjects.
But first we need to be clear what originality and critical originality can mean, within and beyond the subject. As I have attempted to do here. I’d love to know if any of this helps.
The author claims this post to be locally original. He is conscious that he has read about this topic and related topics over the years. He is therefore confident that the post uses ideas previously read and mostly forgotten. He has also chosen to omit ideas that might have been relevant. He makes no claim to global originality. He hopes some of the ideas will be useful – utility is not the same as originality. He feels better for thus having made the status of this post clear.
In the next and possibly for now final post on originality I shall explore relationships between originality and knowledge.
If you’re starting here, I suggest you scan the two previous posts on originality.
So how do we help our students become appropriately original?
The account in the previous post may suggest one way. Teach them more and more content; teach them to engage with the content, to critique and use it. And, perhaps, a few of them will become professors.
Of course, something is missing from this account. Originality does not automatically follow from the accumulation of knowledge, even from persistent active engagement with knowledge. Indeed, accumulation may on a bad day bury a flickering originality under the weight of content. Originality – alongside other academic, disciplinary and professional qualities – also needs to be encouraged and supported and rewarded and valued. From day one.
Helping people to become original
How do we help people become original? From the start of their studies:
- We explicitly value originality.
- We talk with (not to) our students about what originality means, and why it matters, in the particular disciplines they are studying.
- We disentangle local from global originality, perhaps using some of the ideas from earlier posts.
- We make originality into a learning outcome for their programmes of study – “Students will be able to go beyond what they had been taught and read, and come up with ideas, suggestions, explanations, possibly even theories and models which are (in the sense used in these blogs) at least locally original.”
- Or we make originality one of the criteria against which their work will be assessed.
- We encourage students to critique their own work, with reference to, among other qualities, its originality.
- We reward originality, with attention and then with marks and grades.
- We provide many opportunities and much encouragement for students to develop and demonstrate originality.
- Then, throughout the course of their studies, we encourage them along the spectrum from local towards more global originality, in part by teaching them how to engage with the wider literature of the subject, and in part by helping them become more (and justifiably) confident in their originality.
I need to say more about this last point. There is a widespread view of the process of learning. It is rarely made explicit, but it is often clearly visible in the structure of our courses, our teaching, our assessment. This view says that, first, we learn the content. Then, as a later step, we learn to critique it, apply it, be original in it.
I don’t think this view is accurate, for reasons I shall probably return to in another post. But, for now, I’ll say two things about the relationship between engaging with the wider literature of the subject and being and becoming original:
- Good course design and good teaching encourage students to see the literature, not as tablets of stone, but as an evolving set of more or less original ideas and understandings, each building on and then going beyond some previous work. Originality should be one lens through which we read, study and make sense of the literature. We can do this by encouraging our students (and ourselves) to analyse the links and relationships between papers, to identify the particular originality of a publication and how it relates to its predecessors. This will help us and our students to see the structure of the discipline or profession – though perhaps structure is a bit static; better, perhaps, to see how the discipline or profession moves, develops.
- Once this more active approach to the literature becomes habit, our students can seek out and review the literature by asking how it relates to their own recent and perhaps at least locally original work. This inverts the normal relationship between student work and the literature. It puts the student’s work, and in particular the possible originality of the student’s work, centre stage. It asks students to explore how the literature supports or refutes their ideas. This requires a student to take their own work, and their own local originality, seriously. It helps them to act, and thereby to see themselves, as a scholar, as a proto-member of the discipline or profession, rather than as a dependent and naive supplicant.
I said in my first post on originality that I wasn’t going to talk about the quality of the newly originated idea. But quality is obviously important here. So in the next post I’ll give attention to a powerful route to increasing the quality of ideas. I’ll talk about critical originality.
Introduction – originality (you can safely skip this if you already read the previous post)
I suggested a way to think about the question ‘how original?’ I suggested a scale. At one end is what we may call local originality, as in “I had never seen or heard that idea until I expressed it”. At the other end is global originality, where the author claims “That idea has never before been thought in the history of the world – or not in a way accessible to me, anyway.”
So originality in practice may be relative, rather than absolute – relative to what is known, by the writer and also of course by the reader.
How does this play out in practice, in education?
The student and the professor
In week one of their studies, we might hope that a student is already describing accurately, explaining clearly, critiquing in a considered way and applying appropriately what they are learning. (Indeed, we might feel that ‘describing accurately, explaining clearly, critiquing in a considered way and applying appropriately’ is at least part of a fair account of the results of ‘learning’.)
We might also hope that the student is beginning to make their own sense of what they are learning. This may involve them offering some ideas that go a little beyond what they are reading and being taught. They may show a very local, cautious, necessarily under-informed, but nonetheless thoughtful, originality of ideas and understanding. In some cases, through thus going beyond, they may become better able to predict accurately or act effectively. We should hope that this local originality would grow throughout their studies, perhaps at some stage becoming less local, alongside other necessary academic, disciplinary and professional qualities.
By contrast, the student’s professor may be producing a paper which offers a globally original advance in knowledge and understanding, an advance that is informed by and builds on the professor’s global knowledge of previous relevant work and adds new data. (Different disciplines have very different accounts of what research and new data mean. I hope that a version of this account works for most disciplines.)
Thus far I have talked about originality both as a claim about and as a response to the thought or idea offered. But it is also useful to think about originality as a process; the process of being original.
Is there any useful sense in which the student and the professor described above are both being original? I think, yes. Both are going beyond. Both are moving out from what they already know, what they can access, acknowledge and reference. Both are creating – with differing amounts of confidence and significance – knowledge or understanding that is new. New to themselves at least, and hopefully in the case of the professor also new to a wider, perhaps global, audience within their discipline.
From both student and professor we should expect an appropriate degree of reference to current knowledge. But appropriate, of course, means something very different for our student and our professor. The professor, obviously, would know and understand and be able to analyse and critique and interpret and use and reference much more knowledge than the student.
Something about becoming original, and helping people to become original, next time.
What does originality mean? How do we recognise it?
I’ll concentrate on originality in thoughts and ideas expressed mainly in speech or writing, because I don’t know enough about originality in, for example, visual arts. Some of the ideas here may also work for visual arts.
Why does originality matter? Because we say that we value originality, in the work of students and of academics. It will be easier to enact this belief if we are clear what we mean by originality.
This post is not about plagiarism. I shan’t mention Turnitin beyond this once. I am exploring originality and its development and judgement in academic settings.
There will be a few posts on this. For this one I’ll focus on some meanings of originality.
Versions of originality
I can see local and global versions of originality. (Local and global are end zones of a spectrum.)
In the local version of originality, original means original to the producer of the thought or idea, as in “I had never seen or heard that thought until I expressed it.” The cautious would add “… as far as I know”. We can sometimes forget the sources of some of our ‘original’ ideas. Or maybe that only happens to me.
Local originality may still engender and deserve strong positive feelings and respect, in the minds of author and readers alike. And deservedly so, given the intellectual effort, the reading and study and thought and critical analysis and synthesis that may have gone into creating and then enjoying the idea.
In the universal version, original may mean “That thought has never before been thought in the history of the universe.” Perhaps we should stick to ‘global’ rather than ‘universal’, for now at least, not to be over-ambitious.
This global account of originality may still be unreasonably global. ‘Global originality’ might require only that the idea had not previously been published in a form and location which was reasonably accessible to the current author of the thought.
What about the quality of the idea? This may include the elegance, acceptability, power, or other valued characteristics of the idea. I mention this only to exclude it from further discussion here. It is very important, but not my current focus. Maybe another time.
Originality in claim and response
There’s a distinction here that I need to make explicit. As an author, when I offer a paper for presentation or publication, I surely make a claim for the originality of some ideas in the paper, ‘original’ based on what I already know. And as a reader or reviewer, I respond to, judge, a claim for originality, again based on what I already know, on my level of expertise. (I’ll return to ‘expert’ and ‘expertise’ in later posts on originality.) In practice, originality, whether as a claim or as a response to a claim, seems to be knowledge-dependent, to be relative. Which surprises me.
Next I’ll probably look at the development and expression of originality, and at being original. But I’ll welcome your comments so far.
Thank you to participants in the Edinburgh Napier University writing retreat at St Andrews in June for the discussion about originality in academic work, by students and by staff, that prompted this blog post and the ones that will follow on the subject. You really made me think, about originality and also about writing. There’s nothing like teaching something to show you what you do and don’t know about it.
We can – we should – prepare ourselves for magic. Here’s why. Here’s the magic. Then, here’s how.
It’s in the numbers:
As we know, Moore’s Law reported and predicted, with good accuracy, in 1965, that the number of transistors on a chip doubles every two years. Many other computing technologies and quantities follow a similar law. That’s a factor of over 30 million from 1965 to 2015.
Story: This matches my own computing experience well enough. My 1984 Sanyo MBC555 had 640K of RAM; my 2010 Dell 9200 has 4 GB (both cost around £1000). This is a RAM scale factor of around 6,000 over 26 years. Moore suggests a factor of around 8,000 over this period. Close, on a log scale – under a year off.
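The arithmetic behind these figures can be checked in a few lines. A minimal sketch in Python, using the dates and sizes quoted above; a doubling every two years compounds as two to the power of half the number of years.

```python
import math

def moore_factor(years: float) -> float:
    """Growth factor implied by a doubling every two years."""
    return 2 ** (years / 2)

# 1965 to 2015: fifty years, twenty-five doublings.
print(f"{moore_factor(50):,.0f}")  # 33,554,432 -- 'over 30 million'

# 640 KB (1984) to 4 GB (2010): the observed RAM growth factor.
observed = (4 * 2**30) / (640 * 2**10)
print(f"{observed:,.0f}")          # 6,554 -- 'around 6,000'

# Distance from Moore's prediction for the 26-year span, in years.
years_off = 2 * math.log2(moore_factor(26) / observed)
print(f"{years_off:.1f} years off, on a log scale")
```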
It’s in the experience:
Hegel noted that a quantitative change over time can become a change in quality. C:\ to GUI; Sputnik 1 to communications satellite; Galileo (telescope) to Hubble (telescope). These are obviously not Hegel’s examples – he died in 1831.
Combine Moore’s Law with Hegel’s Law and we see that we are getting qualitative change from straight technological advance.
Yes, but where does the magic come from?
Again, it’s in the experience:
Clarke: “Any sufficiently advanced technology is indistinguishable from magic.” Magic is certainly qualitatively different from everyday experience, whether it’s the playing card chosen secretly by me being identified correctly by the magician or the iPhone 4S that, on a good day for Siri and the user, and if Siri and user are both in a tolerant mood, with a good internet connection and no head cold, can pass a gentle version of the Turing test (it responds like a person so it may be a person).
Moore, Hegel, Clarke: magic. We need to be preparing for magic.
What will the world be like? We know quite a lot.
The world is moving towards everything everyone everywhere everywhen.
Everything? First, everything existing as bits: data, becoming information, becoming knowledge, becoming judgement, becoming intelligence, becoming wisdom, becoming – whatever, for you, lies beyond wisdom. Transcendence? Whatever. This progression is already part of our evolving experience and expectation.
Example: From ‘Here’s the train timetable, make of it what you will’ to ‘You are an eight minute walk from the station, this is the route, your train home goes from Platform 6 at 1750 and it’s on time and your seat is A24 and you can get a meal at your table’. You’ll have your own examples.
This progression, this quest, is harder than the Moore’s Law progression; much harder. But Moore’s Law is a steady supporter. So are our growing smarts, ‘our’ now meaning us and the technology working together.
Example: We used to have to specify every process, every line of code. Now, increasingly, we can say to the system ‘Here’s roughly what we want, now go play, go learn, and come back when you’ve cracked it. And let us know about any other good ideas you spot while you’re playing and learning’.
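That shift can be caricatured in a few lines. This is only a toy, with invented data: the hand-written rule is Celsius-to-Fahrenheit conversion, and the ‘learner’ is a bare least-squares fit standing in for the far grander systems meant above.

```python
# Style 1: specify every step -- we wrote the rule ourselves.
def fahrenheit(celsius):
    return celsius * 9 / 5 + 32

# Style 2: 'here's roughly what we want, go learn' -- hand over
# examples and let a (very small) learner recover the rule itself.
examples = [(0, 32), (10, 50), (25, 77), (100, 212)]

n = len(examples)
mean_c = sum(c for c, _ in examples) / n
mean_f = sum(f for _, f in examples) / n
slope = (sum((c - mean_c) * (f - mean_f) for c, f in examples)
         / sum((c - mean_c) ** 2 for c, _ in examples))
intercept = mean_f - slope * mean_c

print(f"learned rule: f = {slope:.2f}c + {intercept:.2f}")
# Both styles now agree on, say, body temperature.
print(round(fahrenheit(37), 1), round(slope * 37 + intercept, 1))
```

The second style never saw the rule, only examples of its behaviour; that, writ very large, is the ‘go play, go learn’ conversation.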
The synergies are kicking in. It is increasingly clear that intelligence requires knowledge as well as computing power. And the knowledge is becoming more available, in part because more of it is being produced already digital / taggable / searchable. But the next level will always be hard.
I won’t say much about everything as atoms. 3D printing moves us along the road to the everything of atoms, perhaps connected by flows of bits. We may find we can do more with plastic than we currently realise. We are learning to 3D print other materials.
Everyone? Yes, everyone, whether as source – “Anyone out there know how I can…?” – or as audience – “You may enjoy this…” – or as collaborator – “If you do this bit and I do that bit and we put them together like this…” – even for a moment. Everyone, that is, whom we can find, or who can find us; and everyone who wants to. Finding people gets easier. We’re maybe not quite as good yet at permissions management. But we will learn.
Everywhere? Everywhere we and those we allow to have power over us want!
Everywhen? (A more common word is ‘always’, but I wanted to keep the ‘every’ riff going.) Ditto as for ‘everywhere’ above.
Everything everyone everywhere everywhen needs a filter and a pause control and a volume control and (although less and less likely by the month) maybe even an off switch.
The rest is detail. Fascinating detail; but detail.
Use the new stuff – a new app isn’t a marriage, it’s a flirt, or, if you really get on, a fling.
Ready fire aim; or just fire, and see what happens. Be critical afterwards, not before.
Kids learn faster than we do because they play more. Be a kid. Dignity isn’t so important. Learning is.
Find good sources of advice and information, follow them ‘til they become less good, then ditch them for better. There’s always someone better. It may, in time, be you.
Ask what will happen, what will become possible – when, not if. Here are some examples; write your own. Concentrate on what you want. What will happen, what will life be like, when:
- Everything: My computer knows what I’ll need next – not what my cookies think I should want next – and with the odd nice surprise – and has them waiting for me – and doesn’t get upset, but rather learns, when it’s wrong. And when I never have to do exactly the same thing more than once ever again?
- Everyone: I can readily find and contact the very best person for me to talk to about x. Or their avatar?
- Everywhere: I don’t have to decide which device with which information to take where to do what, but rather My One™ does and knows all I need – a thinking Dropbox for my life?
- Everywhen: I want!?
Not forgetting filter, pause control, volume control. OK, no off switch.
Then, maybe, when you’ve predicted what will come and decided what you want, seek it. It may already be out there. Your future is sometimes already someone’s present.
Story: Several years ago, cluttered by and guilty about unused back-up devices, I thought it must be possible to do this online. So I searched for ‘online backup’; found several services; chose one; used it; migrated to a cheaper one a couple of times – currently, Carbonite, as long as they stay good and cheap. Oh, and I bought a 2TB hard drive – full circle!
If it’s not already out there, and you’ve a mind to, invent the magic.
Magic is not spoilt by knowledge
Just because you understand some of it, the movement of which it is a part, its technologies, doesn’t stop it being magic. Magic is how you feel about it. My MBC555 was magic. It made new and amazing things possible. It changed my world. It improved my world. So will my iPad 5.
And anyway, where do you want to be – in the front row of the stalls gasping in amazement, or up on stage being amazing? I recommend some of both.
At least we know the broad outlines within which the future will happen. The numbers rise; quantitative change becomes qualitative; magic happens; and we move towards everything everyone everywhere everywhen.
Of course, the genetic engineering / synthetic biology magics may not fall within this framework at all. And we have to not wreck everything in the process of advance. We may all go down together in the best toyshop ever. We need to make the magic work for much bigger goals than an iPad 5. Real magic means doing the right things, not doing the wrong things better. We’re probably smart enough. Are we brave enough?
More about everything everyone everywhere everywhen
Then, what does this mean for higher education?