The Naked Ape

Forty-odd years ago, Desmond Morris brought the study of humans as animals to the common man through his book, The Naked Ape. This work was the first to make me think about what it is to be a human, something which has been exercising many of us in the EDCMOOC this week.

Phaedrus’ Knife

Advances in our understanding of the physics, chemistry and biology of living things, in particular of DNA, have allowed us to begin to appreciate that the traditional classifications of living things are in need of revision. In recent centuries we have developed the classification, naming and organisational structures which allow us to understand the connectedness of life on this planet. These were largely derived from observational field- and laboratory-work, and remained almost entirely empirical until Watson and Crick discovered the structure of DNA, the molecule of complexity common to all known life on Earth. The naming and classification of living things has been subject to the same influences as the naming and classification of non-living things.

…an intellectual scalpel so swift and so sharp you sometimes don’t see it moving. You get the illusion that all those parts are just there and are being named as they exist. But they can be named quite differently depending on how the knife moves…

Pirsig, Zen And The Art Of Motorcycle Maintenance

The term Human, then, has no meaning for anyone but those using the term. It’s a general description of a group of things with broadly similar characteristics, distinct from another group with another name. How you choose to classify and name is – or should be – entirely up to you, appropriate to the purpose. There are areas we dare not yet go, however. It is difficult to describe a black man as “a black man” without somebody wailing and shrieking that you’re a racist.

Education is a Dying Art

Steve Fuller’s TEDx talk considers what it is to be “human”. In it, he throws out the phrase “Education is a Dying Art” in the context of being human in terms of artifice, as if the thing that distinguishes humans is the notion of being something more than just enough to survive. Becoming human here implies gaining skills, knowledge and awareness of one’s part in the collective of humanity. This, for me, is society: the analogue of how the fish is part of the school or the ant is part of the colony. Each has a part to play which must be learned or acquired somehow, and in the few hundred years since the Industrial Revolution this has become part of the organisation of society. Education has evolved within the society we are building for the purpose of making humans part of the machine that functions to sustain us all. Those who have learned to play their part function within it.

My thoughts on Fuller’s statement about education being a dying art are that formal, state- or society-sponsored preparation of people to play their part in the machine of society is changing. New levels of choice are open to many, not just in the First World of post-industrialisation, but in the new interconnected world, in which access to new roles in the emerging Global Society is possible where it has never been before.

Transhumanism


The evolution of our society as a life-form collective – one which has developed the ability to extend its influence beyond that of any individual, for purposes beyond mere survival and procreation (which is arguably the limit of DNA-mutation evolution) – has reached what might be called a tipping point. There is a new realisation that the collective impact of our society potentially threatens our survival, yet the properties of the society itself prevent it from mutating the enhancements that would secure that survival. We are able to sustain and enhance physical potential beyond what would have been possible without intervention – Stephen Hawking, and many more like him – but we do so at a cost of which we have perhaps not yet become fully aware in any sense that lets us do something about it.

At the individual level, however, things are brighter. Those who would, a hundred years ago, have forever been denied the slightest chance of realising their potential to contribute to the collective are now finding it possible to participate in opportunities to contribute (and benefit themselves and their families) and make a real difference to the development of all of our health and prosperity. This comes through the technologies of infrastructure and communication, increasingly at our fingertips and part of us as the new humans.

The Truth is Out There

I’ve been thinking about some of the ideas and aspects of what it is to “be human” in this week’s edcmooc activities. Whilst doing that, I was prompted by a tweet on Edison’s birthday:


I don’t know if you know about Edison. He is often, as the tweet suggests, regarded as a visionary man who worked hard to achieve his ends. The record shows him to have been an utterly ruthless man who went to extraordinary lengths in a battle of competing technologies to win, regardless of what the cost to other people (and animals) may have been. I leave it to you to find out how he tried to discredit Tesla’s AC power solution by, inter alia, publicly electrocuting dogs, horses and even an elephant and by inventing the electric chair as a means of human execution. You might say that the technology corrupted his sense of decency to the point where his behaviour can only be described as inhuman. Is this about technology or about being successful in business? Anyway…

Another world, just below the surface

We might see ourselves as somehow separate from the technological world with which we clothe ourselves, or at least, those of us rich enough to do so, do. The Toyota video suggests that there’s a “truth” beneath the veneer of the ordinary lives we lead, Matrix-style, that we can break out of. Kris Marshall’s Adam seems aware that his chance of retaining the modern family unity intact is going to be enhanced by phoning, rather than FakeBooking Jane.

One of the reasons I like technology tools is that I can walk away from them. Take the mooc, for example. If I were enrolled in a traditional course, I’d have all kinds of logistical imperatives to keep me attending, not least the cost implications of dropping out. It’s hard to stop attending if doing so has a penalty that’s not easy to pay. The mooc, however, is easy to walk away from. If I were to do so, I would incur no cost, no embarrassment, no challenge to explain. No penalty. It’s like I could just close the browser, shut the lid and go for a pint or paint the bathroom. If push came to shove, and I felt the need to break out, my iPhone and everything else would go straight in the nearest bucket. How liberating. Why are my palms sweating? Anyway…

Retention and the emotional dynamics of video

Taking the Hersh article, then, there are what seem to me to be false arguments about why students accessing learning through an LMS like Blackboard or Moodle drop out so easily in comparison with those who have to drag themselves into lecture theatres. Video is offered as mitigation against this, but I don’t think that’s it. Sure, retention might be improved by increasing social interaction, but you can still walk away. Is it because the virtual learning experience is less real than the one that requires your physical presence? Is it so evidently a false experience that when the expectations aren’t realised, we can just “switch it off” with impunity? The idea that the “illusion of non-mediation” through the “emotional dynamics of face-to-face” is created by making videos of yourself is frankly ludicrous. Even people who shout at the TV know the difference between “real” and “video”.

I was told once that chess was developed as a game of battles so that real battles wouldn’t need to be fought. I’m not so sure, having experience of Uckers in the Army. Here, what’s real and what’s the analogue are hard to distinguish once the chaos of the end-game apocalypse begins. Anyway…

Asynchronicity

One of the rushes I’m getting at the moment is the realisation that this information renaissance we are living in now is transforming education, kicking and screaming. The transformation is coming about because we are beginning to learn how to cope with the contradictions of information flow in the new age. Let’s take this mooc as an example. We are all accessing the same information – resources we are all supposed to view, read, or whatever. Because we are so diverse geographically and socially, we can’t all do this at once, so we do so asynchronously, at our own convenience and on our own terms. The tech allows this. But the mooc isn’t 40,000 (or however many are left) people independently doing the same things in isolation; it’s a community, or rather, a community of communities of shared experience which interact with each other. One of the ways we are interacting is synchronously, through twitter chats and Google hangouts. These move very fast indeed, and keeping up with the pace of discussion requires not only appropriate technology but also all the wits you can muster. It’s like a new level of consciousness. This is the new sh*t. Not only is new education spawning new channels for students to access learning; it should also be spawning new stimulus from educators (after all, that’s their role) to help learners cope with the emergent properties of these new channels. Anyway…

I’m going to abandon the technology for a couple of days and go immerse myself with a friend up in the real world at Loch Rannoch. The truly amazing thing about it is that this is a world which made itself in all its complexity and beauty out of Hydrogen. A lot of Hydrogen and a lot of time, but it did so with no engineer or designer or World Builder. That’s the truth.


Being human in a digital learning age

One of the reasons I’m taking e-learning and digital cultures (edcmooc) is that I’m interested in being an effective educator. The world is changing fast as technologies and channels of communication evolve, and I’m interested in adapting and riding the wave of opportunity they represent. I’ve trialled things like VLEs and websites in various forms, I’ve made audio and video podcasts, played with PDFs and had students submit homework in any number of forms, including video, audio and even Labanotation. I’ve learned several things.

  1. Personalisation and choice are important
  2. People think differently and communicate differently
  3. Some things take time and effort to understand
  4. Replay is powerful
  5. Good things happen when people meet and talk
  6. Learning is possible in anarchy

These point to several crucial factors for the learner.

  1. I want to learn what I’m interested in.
  2. I want to be able to think it through, over and over if necessary, until I understand it.
  3. I need to be in control of my learning resources.
  4. I need stimulus.
  5. I may need encouragement and support.
  6. I need access to a civilised environment.

Through all of these points, I see the role of technology in terms of providing access to resources on the learners’ terms: asynchronously, in a medium he or she is comfortable with, replayable, searchable, indexable, clippable and aggregatable. I see the role of the learner as whatever he or she needs it to be, for his or her purpose. I see the role of the teacher as providing stimulus, resource, challenge and support, and facilitating meetings – ideally real, but virtual if there’s no other way – between learners of the same material, who can respond to the teacher’s prompting in order to develop further learning.

This describes for me a model of learning in which there is structure, content, challenge and assessment within a very human context – socially constructivist, if you like – which is made available through the enabling channels provided by technologies. These technologies offer recording and playback, tagging and organising by the teacher and the learner.

I think this model of learning is called “blended learning”, and I think it’s here, in the room, now. You might have noticed that I have not included peer commentary here – I’m not convinced that it’s necessary, although I can see that it’s helpful.

Looking to the Future and Metaphor

I’ve just been reviewing the “week 2 responses” digital artefacts posted by our full-time cousins, the MSc students on this course. I’m afraid they’re all a bit too arty-farty abstract for my taste (or stage of cognitive development, you decide). I feel like I’ve just eaten a bowl of what Dozer would call a…

…single celled protein combined with synthetic aminos, vitamins, and minerals. Everything the body needs.

from the Nebuchadnezzar.

Popular Cultures

Back in the 1970s I was an avionics technician in Her Majesty’s REME. I remember thinking, as I was stuck in the cramped access hole under the screaming turbine and gearbox of a Westland Gazelle helicopter, balancing a 2kg AVO meter, instrument screwdriver and cables as I adjusted the flight idle busbar voltage, “wouldn’t it be nice to have some kind of head-up display device here, showing me the procedures and settings for this operation?”. I’m not claiming to have invented augmented reality, having seen head-up displays on fighter jets. I think I did see that it would be a logical application of technology to make operations like this safer and quicker.

Later, when I was working at Boeing in Seattle, there was talk – never realised, as far as I know – of making maintenance procedures available to ground crew working in difficult places or situations through a lightweight helmet-mounted display system. How different are these ideas in principle from the visions offered by Microsoft and Corning of a future day of glass? Or indeed, the cheesy short, “Sight“?

Metaphor and learning

For a student and teacher of physics, analogy, simile and metaphor are essential to developing cognitive models of the universe. When used intelligently, they are powerful tools for understanding. Intelligent use means being aware of the device while using it – many teachers and students fall into the trap of accepting the analogue as reality. This is OK on a temporary basis, such as when teaching children about electricity, but when development doesn’t proceed beyond a certain stage, people can be left with a seriously wrong understanding of the thing they think they know, often for the rest of their lives. The metaphor’s limitations should be understood when it is put to use. XKCD again:

It could be said that all learning, if you subscribe to the constructivist model of learning, is based upon building analogies or connected patterns of related things. Our brain has evolved as a pattern matcher, and it makes synaptic connections very quickly between associated patterns. These patterns could be ideas, sensations, emotions: we are, after all, limited to experiencing the universe through our senses, and only consciously through what our brain’s processes allow us to perceive as significant.

The Fad of the MOOC or a new MP3?

My father had a collection of long-playing records (ask an older person) called “Teach Yourself Code”. I still have them. These were a course which, if completed, would render the student an expert in Morse Code. He didn’t get past the first disc.

The MOOC is for me a modern analogue of those LPs. Modern, in that it represents the new internet expectation that things are free, including information. The MOOC, through structure and content, gives me information and the opportunity to acquire it in a critical and thinking way. This for me is its value.

Learning is changing. Schools in Scotland are beginning – in small numbers, but watch this space – to offer students a blend of learning experiences which include academic “traditional” school and vocational opportunities, with online courses from the likes of the Open University. This is a fantastic development which blurs – and will break – the boundary between school and life-long learning.

Visions of The Future

I’m just about to settle down and get stuck in to the week 2 material for #edcmooc which is tagged, “Looking to the Future”. Before I read anything, I’m going to make a few observations on how I see looking to the future as a purposeful activity.

Winston Smith would no doubt recognise the doubleplusgood duckspeak of much of today’s pronouncements on education, if the future described by his creator weren’t so utterly wide of the actual outcome. Hello. Is it me you’re looking for? Lionel Richie was in the charts in the UK in 1984 and I, thankfully, was in Seattle doing some damned clever things for Boeing with software that you compiled overnight and stored on 20-kilo, 80MB disc packs the size of dustbin lids. I had TV-screen glasses, a moustache and a Suzuki v-twin. Some of you reading this will need to ask an older person what a dustbin lid is.

The track record for insightful forecasting of future worlds is pretty weak, in my experience. Nostradamus. Orwell. Wells. TV21 comic. Russell Grant. Mystic Meg. Jesus. Ian Smith. I have no reason to believe that the authors of the latest four visions of Scotland’s education in 2025 will be any better than those.

So with that as context, I think I’m ready for a look at what can only be regarded as the fiction of the future and as one who has been called Davros in the past 24 hours, this should be entertaining.

Copyright

I’ve been having a conversation with a couple of friends about things like moocs and blended learning and the future of education. In that dialogue, the issue of sharing content under the Creative Commons License arose – this being offered as a reasonable step to take to share resources for online learning without having somebody steal it and sell it on for profit.

My first blog was posted in 1997 and within a few months I realised that the images and content I so carefully created were being used by others to make money, either through aggregation or lazy theft of copyright material. Since that time, I have tried on a number of occasions to seek redress, either by having the content removed, or properly acknowledged and back-linked to the source, or by seeking some kind of royalty or compensation for the use of my intellectual property. Only once have I ever been successful in getting appropriate acknowledgement posted for my articles and images.

I’ve often defended against leeching by substituting another image – something a little less appropriate for the thief’s original purpose, usually. There’s not much you can do when your images or content are posted into someone else’s server space. Legal fees are prohibitively extortionate – £150 per hour is mate’s rates around here and it’s only lawyers that can afford that kind of expenditure.

Encouraged by my friends’ assertion that I have the law on my side, I’ve just billed a well-known Scottish University £200 for Royalties for an image (a drawing of mine) they have stolen from one of my sites which is clearly marked copyright. Ironically, on the site where they have posted the image, there is a notice for students which states: “Note: Don’t post other people’s pictures to your blog without permission” a few lines down from the infringement.

I’ll let you know how it goes. UPDATE 28 March 2013: They paid up.

The myth of the digital immigrant

The trouble with popular wisdom is that it is more popular than it is wise. One of the recent truisms is that young people are much better with technology than their parents and teachers are: this has been hackneyed into the collective term, “Digital Natives”, meaning those people too young to remember a time before the popular use of computer devices. Those who are too old to be included in this group are branded, “Digital Immigrants”. The caricature of the latter group is of the hapless user struggling to make sense of new devices and technologies, not able to communicate their difficulties because they lack the “entirely new” language of the natives.

EDCMOOC Readings, week 1 (part 2) – on education

One of the most pleasurable things about being an educator who came late to education, after decades in rather more robust environments, is that every day brings the comedy of somebody discovering some new truth or embarking on some new initiative to improve this or that. Attainment, usually, whatever that is. I recognise many of the things we threw out as useless wastes of time in industry in the ’80s as being the bee’s knees of 21st century education. Much of the Newspeak comes from a whole industry of social researchers screaming the relevance and imperatives of their findings in the familiar tones of the operational researchers of industry thirty years ago, who gave us AQAP, PRINCE, Agile and IIP. The characteristic of company evolution in those days was: (i) IIP, (ii) knighthood for the chairman, (iii) fountain in the lobby, (iv) redundancy notices. Smart stock traders knew these signs.

Having set out my prejudices (or perspective, you choose), I’ll start my thoughts on the EDCMOOC readings with the fact that I was irritated by the Dahlberg reading as having the characteristics of arbitrary and pointless analysis. Social artifacts, behaviours and vocabularies populate the pages in an analysis well suited to the Golgafrincham B Ark.

The Social Darwinists

Prensky’s paper, in which the Digital Native/Immigrant terms were first coined, is, according to Wikipedia, “seminal”. I think the terms have become influential in educational circles as teachers crave gadgets, gadget makers sell into ignorant educator buyers, and careers are made in the squabble that is the justification for advancement in schools. The winners haven’t been the children. Interactive whiteboards at a couple of grand each are scattered across the educational landscape with little or no pedagogical justification. iPads are being thrown at children. Prensky is for me anything but seminal (except perhaps in the second dictionary sense); rather, he writes using emotive terms such as “singularity” – an abuse of physics if ever there was one – to make unjustified and unchallenged claims like

today’s students think and process information fundamentally differently from their predecessors

which is patently untrue, and yet swallowed by a global deputy-head mindset looking for some new initiative with which to make its mark. The brains we have took billions of years to evolve the way they work. Except perhaps through the use of very slightly different language, they work the same way now as they did a hundred years ago. What hasn’t changed, perhaps, is the way we view our children:

When I was young, we were taught to be discreet and respectful of elders, but the present youth are exceedingly disrespectful and impatient of restraint (Hesiod, 8th century BC)

So, to suggest that one has to have been born after such-and-such a date to be down with the tech is to create an artificial barrier between educator and learner, and a whole bundle of excuses to go along with it.

The Machine is us

Wesch’s video from 2007 describes a view of the evolution of hyperlinked technologies until the start of popular use of social media. Again, it is a semantic description of a part of the whole and takes the perspective of the user of the machine, ignoring infrastructures that are fundamental to the operation of the machine. It’s like discussing the development of driving without considering roads or the internal combustion engine.

So, what of education in the new world?

To quote Ewan MacIntosh, “it’s about the teach, not the tech”. Noble hints at the commercial imperatives driving education, and this is a consequence of the society we built. From the clamour for interactive whiteboards without ever considering why such interactivity was justified where the blackboard used to be, to the monetisation of education media channels, there is pressure on educators from commercial interests, driven by the demand perceived when, for example, 40,000 people sign up to A New Thing.

Conclusions

I am generally between thirty and forty years older than my students. I know a great deal more about the technology than all of them. I make it my business to. Whilst I might make the occasional naive reference to emptying clips in World of Warcraft, I have the better knowledge, not only of the technology but also the pedagogy, than my students. I will not be intimidated by the social Darwinism of “Digital Natives and Immigrants” because it’s false. There is knowledge and competence in technology, as in other things, and these aren’t the domain of a particular age group except to say that if you’re older, you’ve had longer to learn.

Next

The future. Where do we go from here?

Looking to the past: EDCMOOC Week 1

How was the first week for you?

You would think that on a course with over 40,000 registered students, the experience of participation would seem something other than solitary. Yet, this is how it has been for me: snooping voyeuristically upon snippets of twitter conversation, trying out and rejecting the Google+ stream, or casting about the coursera pages. Even the Google hangout, which I watched as an embedded YouTube feed, felt like an hour watching five people giving presentations, podcast-style, to an internet audience, with some random pickup from the twitter hashtag. This isn’t to criticise: the feeds, hangout and other official pages have been useful in getting me focused on the task of engaging with the course.

Technological Determinism

Block 1 of the course is concerned with how digital culture or digital education can be viewed as utopian or dystopian. Information Technology is described in these views as having built-in properties which are either democratising or de-democratising. This influence of technology is seen as driving social structure and cultural values: further, technology has been said to develop along predictable paths with society organising itself to support and develop the technology once introduced. The film Bendito Machine III characterises this view of technological determinism within the setting that the technology is provided by some higher power: it is as if the technology is something divine or other-worldly. I am reminded of Arthur C Clarke’s third law of prediction:

Any sufficiently advanced technology is indistinguishable from magic

Whereas Bendito Machine III acknowledges the erosion of social interactions caused by the introduction of new technologies, the second film, Inbox, celebrates the new interactions made possible by new channels of communication. These channels seem to lend themselves to serendipitous meetings and, despite the occasional tech failure (the ripping of the bag – it had to be the boy, didn’t it?), a happy ending is had. This utopian view of the development of communication is what makes me appreciate the age I live in: the channels are rich and manifold, and I can choose to participate in them or not. When I do, my life is often (although not always) enriched.

Technology is different, or a natural development?

The third film, Thursday, depicts a couple living in a technologically dominated world, but I can’t help feeling that it isn’t the technology itself that represents the “differentness” from nature. For me, the technology is just one aspect or manifestation of a more general urbanisation, itself a product of the evolution of our species and its habits. When we moved from being hunter-gatherers to settlements, and from adaptation to the environment to adaptation of the environment, did we establish the behaviours of natural adaptation that lead inevitably to the development of technologies like the iPhone? The final film of four, New Media, looks like the opening sequence to a film like “War of the Worlds”, about conflict on Earth with superior aliens and their machines of (our) destruction. The nightmare cameo of the alien pipe connected into the human figure is evocative, but no less so than the plume of smoke which for me heralded the beginning of man’s fight back against the superior power – maybe I’ve seen too many of these. I look for Thunderchild.

Readings: Chandler

The reading by Daniel Chandler is a kind of idiot guide to Technological Determinism and I think it told me a lot more about social science (and why Brian Cox suggests that social science is an oxymoron) than it did anything else. A couple of examples will illustrate my take on this.

Nature vs Nurture

I’ve been reading articles on nature vs nurture for over thirty years now, since I joined Mensa – a vanity society of people who know what shape comes next. I think it’s a populist media habit to try to stir up passion (increasing circulation) by offering two “opposing” choices, and nature and nurture have long been favourites for those choices. The argument is evidently false and simplistic: in trying to decide whether genetics or environment is the deterministic cause of influence, the combination of the two is ignored as an invalid choice. Consider, however, the device on which you are reading this: is it the hardware (what it’s made of physically – its nature) or the software (the programmed instructions it is following – how it has learned to behave) that decides how well it works for you?

Sapir-Whorf hypothesis

This interesting “hypothesis” (it never was such a thing) is said to assert that thinking itself is restricted by the language of the thinker. I have no problem with this idea if the linguistic processes represent how the brain is programmed, including the semantics, structure, grammar and resolution of the language. In exactly the same way, a different operating system can dramatically affect what a computing device can do – take your old Dell laptop and replace Windows with Ubuntu to see what I mean.

Reification

If technology push is evolution, then demand pull is Intelligent Design. I prefer the former as a model, although design improvements can be market-driven (but demand is not synonymous with market here). Demand all you like, there’s no technology going to iron your shirts for you. In describing reductionism, Chandler suggests that technological determinism focuses on causality – whether it is mono-causal or an “independent variable” – which suggests that there is something scientific in this argument. But this isn’t science: it’s scientific method, yes, and that’s a different thing from science.

So, is technological determinism a “thing”? I think it’s a phenomenon, something that we can describe and define in human terms without actually making it real, in the same way as we can define evolution. These are constructs that allow us to talk about them – features of our language, only.

Free will and the quantum

Hydrogen is a colorless, odorless gas which, given enough time, turns into people (H. Hiebert)

Adaptation and evolution are manifestations of random variation in subatomic phenomena. They are inherent in the physics of the universe. Technology, like urbanisation, is the manifestation of our continuing evolution as a species. The truly amazing thing to notice is that man is not the only species doing it, and with 17 billion Earth-sized exoplanets just in our own Galaxy, the possibilities are beyond imagination.

Next

Perhaps a consideration in a narrower context: education.

What shall we call it?

Well, I’m exhausted. I’ve been mooc-hing about the various online and social media manifestations of edcmooc – the E-Learning and Digital Cultures Massive Open Online Course – and I’m beginning to get the idea that this is certainly a massive community. There’s no point in blogging how many people are in on it because by the time I get to the end of the sentence it will have changed again.

What I am discovering is that I’m not as up to speed with the web tools and channels which might be useful as I thought I was. There’s the problem with having to work for a living.

Still, never mind. I’ll share one with you that I didn’t know about, called Tagxedo, which does the clever wordle thing but from an easy interface – give it an RSS feed, a twitter account, or whatever, and it’ll produce a nice interactive widget which you can blag for your blog (see above example) for free. It’s nice.

Anyway, pardon me, I’m going to have a lie down, having had a look around some of the edcmooc stuff – Facebook, Twitter, Wiki(spaces), virtual school, Pinterest, Diigo, Google+, GoogleDocs, Google sites, maps, Meetups, Flickr – all in use, and the course doesn’t start for another two weeks.

So what is this massive community doing with all this stuff? Basically, saying hello, pointing at other bits you don’t know about, trying out stuff, deciding what colour it should be, whether we should all go down the pub or meet in Barstucks, hello, what do you do, I’m in media, oh wow, and isn’t this all lovely, wouldn’t that cheeseplant look better by the window kind of stuff.

I can’t make up my mind if I’m over or underwhelmed. Watch this space.