Higher Education IT

what to do when students don’t want to learn

The strangest thing happened to me the other day while helping train some students on a collaboration suite we have at work.  The professor had already explained that students would be working in teams, with their laptops, to develop the beginnings of a legal outline.  They would use the software/hardware solution – powered by Tidebreak’s Teamspot software – to interact with the shared document.  Hopefully, the students would discover greater productivity and learn more about how to develop the outline through this collaborative effort.

That’s all well and good.  It’s to be expected, and the professor had a really great and open mind about how to use the suite that night, possible uses in the future, and ideas for best practices.  What was not expected was students asking me things like this:

“Do I have to take part in this project?”

or

“I downloaded the software.  Now I’m trying to decide if I want to install it.”

My basic response to these comments was along the lines of “well, it’s for a class assignment, and I believe the professor wants all of you to work as a team, with your laptops, on the shared display.”

This did not produce much of a response.  In fact, one of the students showed rather blatant disdain towards me and commented that she was willing to install the software only because she was using a Mac; she would not have done so on a Windows machine.  The clear implication was that I was, for some reason, recommending the installation of software that was either going to compromise her computer or, even worse, was malware to begin with.

This honestly baffled me.  On the surface, I couldn’t believe that students would question whether they should take part in a class assignment, as requested by the professor.  Even if it’s a situation and/or environment with which they are unfamiliar – this is the professor asking them to do an assignment that is, ostensibly, critical to their education (are assignments ever not critical to your education, really?).  And while a bit of caution when installing software is always prudent, why would someone think that I, an assistant dean, would be asking them to install malicious software?

We in Educational Technology are in the business of improving the teaching and learning of our respective fields (or, perhaps, to challenge existing paradigms that have generally governed those methods).  Most of the time, we seek out innovative technologies but, more importantly, act as consultants and work with faculty to find a tool – whether new or old, cutting-edge or somewhat banal – that will help them in their tasks.  We are here to help improve things.  We are not here to cause problems, to decrease productivity or success potential.  We are certainly not here to put viruses on your computers.

I’ll admit that when I introduced myself I was sufficiently befuddled that I sounded 100% geek.  I didn’t get into how part of my department’s charge is to introduce innovative technologies and methods to faculty.  I sounded like a robot.  A nervous, very geeky robot.  But that’s not the point.

If I’m there, it’s because the faculty member has agreed that there is potential for benefit and/or improvement to teaching and learning.  If I am asking you about your progress with some technology, it’s because I want to see how far along you are in getting set up to use that technology to improve your classroom experience.

Netgen, Gen X, part-time, full-time – whatever student category you fall into – understand this.  The faculty, the staff, and certainly my department are there to help.  It is to your benefit to let us do that.

cybernudity

A while ago, I read an article describing what is commonly called the “NetGen” or “Gen Y” (loosely defined as those born since 1982, though I don’t personally agree with that demarcation) as being very comfortable with “cybernudity.”  In general, this means that those who have grown up with computers and the internet as everyday tools of life (as opposed to a “new” invention that changed the way one works, interacts, etc.) have no real problem with the world knowing everything about their activities, their interests, their religious feelings, and so on.  You can see this every day in seemingly pointless Twitter posts and Facebook status updates.

The article also claimed that the NetGen is simultaneously fiercely protective of its identity security.  While these individuals don’t mind if strangers know that they are at the local coffeeshop meeting with friends (via a Facebook Places or Foursquare check-in, perhaps), anything that could lead to identity theft is completely out-of-bounds.  You can know where Jane Doe is, but you do not get to be Jane Doe, no matter how “cyber-naked” she is.

This brings us to an interesting place – sites such as spokeo.com and many, many others hook into Facebook, LinkedIn, the Yellow Pages, personal blogs, etc. and aggregate all of one’s personal info.  You can look up a person by name, drill down by state and city, and find out a great deal of information.  This includes things like family size and wealth, type of residence, and even street location.

On the one hand, this kind of information is exactly what one would need to steal an identity, short of an SSN.  On the other, the places from which this information is gleaned are often required to be public.  Consider:

  • Facebook still defaults far too many things as public, meaning that one’s personal info on the largest social networking site is right there, in the open.
  • LinkedIn, by nature, needs to have a detailed public profile in order for professionals to find each other.
  • A Yellow Pages listing is a critical component of running a business effectively, which means an owner’s information is out in the open.
Those are just three examples.

So, if the NetGen likes being cyber-naked, but wants to protect its identity, but also needs to be visible in the right places in the right ways, and still ends up on spokeo…what to do?  The line gets fuzzier, indeed.

Monterey College of Law Pilots iPad Programs for Students and Faculty — Campus Technology


A professor here at the Law School forwarded this to me recently.  He didn’t say anything in his message.  He just sent the link.  I guess I would have appreciated an attempt at something other than saying “I want an iPad too” but I’ve learned to manage my expectations these days.

There are a few interesting aspects to this post, some more meaningful than others.

  • It is tied to BARBRI, the Bar Exam prep program.  Programmatic backing is always a critical component to any initiative.  If there is no clear purpose, tied into a practical activity in which the end-users are interested, then it’s likely to be dead in the water.  So that’s good.
  • The main reason cited for providing iPads is that students learn, and faculty do scholarship, outside of the classroom.  Well, they have done both outside of the classroom for quite some time now.  On the student side, I can see where a new interface to this content can be meaningful.  That is good.  But faculty clearly aren’t teaching via the iPad (at least, not likely).  They are not likely creating content via the iPad (possible, but if you’ve met law faculty you’d know from where my skepticism comes).  And the iPad is not the device for doing scholarship.  That’s not so good.

Interesting idea.  Poor reasons cited in the article for the effort.  Sounds like more hype than content.

IT is difficult

FYI:  I struggled with the title of this post for days.  No matter what I did, I felt like I was writing something a 14-year-old would do and laugh about.  Very sad.

I often start my work-related posts with a qualification that I fully realize the difficulties that face university Central IT.  I make my comments about technology in higher ed and my opinions about the best ways to implement such technology and policies purely as my own opinion, but also with respect to the hard work of my university colleagues.  My opinions might be contrary to not only the university’s actions but even to their policy (or maybe even to their way of thinking), but that doesn’t mean I don’t respect their efforts or the challenges that they face.

I’m taking this one step further – the university has been under tremendous fire for communication, governance, and policy issues with its “IS” department.  IS is made up of Central IT, Media Services (classrooms, media support, etc), and the library.  In reality, the problems and complaints have been mostly about Central IT, with a bit of bleed-over to Media Services.  Most importantly, this has come from all sides – the faculty, the staff, departments as a whole, an external committee, and even the national accreditation group used by the university.  This is a big deal, and my perception is that IT is under a lot of pressure right now.

Interestingly, a university faculty member wrote to the entire staff mailing list (why just anyone is allowed to write to the list is a whole different discussion, though perhaps related to the fundamental message of this post) praising a presentation by three managers in IT at a symposium.  These three – one of them the director of IT – spoke of the difficulty of providing an enterprise-level service at a university, the challenges that any large IT infrastructure presents, and the type of staff power (both quantity and quality) needed to provide services that many people take for granted.

The real issue, however, isn’t whether IT at a university (or anywhere, really) is hard.  It is.  Nor is it just that our IT department has provided sub-optimal services at times.  It has.  These are very black and white perspectives that ignore some fundamental, cultural issues.  And difficulty in provision is never, ever, an excuse for low quality of product.

For instance – the difficulty of setting up and maintaining a university infrastructure is unimpeachable.  But the methods through which one builds such a system, and the policies that govern the development and growth of such an environment, must be examined closely.  To simply say that “it’s hard – acknowledge that and we can all move on”  is a gross oversimplification, and an insult to those that try to provide high quality service in such a “difficult” environment.  The complaints expressed by many during the external committee’s “Open Forum” session were often far too vitriolic and ignored the effort needed to provide the services we had.  The e-mail sent to the staff, in turn, ignores that just because something is difficult doesn’t excuse those responsible from mistakes along the way or for not remedying those issues since their emergence.

For instance – from my outside perspective, I have no idea whether the topic of outsourcing of services – cloud, third party, whatever you want to call it – has been discussed properly.  I do not have any clue as to whether the General Counsel has put forth our official stance on having sensitive data on someone else’s servers.  I am fairly certain that a stringent review of our business processes, our personnel, and an evaluation of what we actually need so that we can find the best solution has not been conducted.  I am pretty sure that we’ll go to Google Apps for Education just because that’s what everyone else is doing.  But that’s still probably 20% guessing and another 20% educated conjecture.

Not even all the right people are included in the conversation.  Why am I not better informed of what is happening?  No, I’m not a vice president or provost at the university – I’m not even part of the central university.  But I am the head of technology (CIO, whatever) for the law school.  We are the only other 100% full tech shop on campus (everything but e-mail, ERP, and networking).  Why am I not at the table?  Why have my requests (yes, I have been proactive) to be included on whatever committees come up been met with silence?  I am at the point of leaving vague messages about “however I can help” and “just say the word” in an effort to be informed.  Whatever process they do have in place does not, as far as I can tell, include looking for outside opinions.

This is just one example, but the fundamental issue is that there is a procedural gap – a flaw, a fault, an undersea trench – that no one seems to see while they are viewing only the extremes.  Doing IT at a university is hard.  But that means it is all the more important to be as smart, as considerate, and as thoughtful as possible.  All options must be weighed, and things must be done the right way.

the disappearance of the life of IT folks…or not

One of the most common “issues” and topics of discussion among IT professionals in higher ed is our potential obsolescence in the face of the changing student population, the infusion of uncontrolled media, and non-university solutions for connection – IM, Facebook, etc.

There are various articulations of this fear, but the gist is that because of all of these changes, the way we have always done IT will no longer be relevant, and we will lose our jobs.  Or, at the least, that we need to watch for and perhaps even fear these changes.

I am, as I begin this post, attending a keynote regarding the paradigm shift that social media, desktop servers, cloud computing, and other technologies present to (university) IT departments.

Let me rephrase that to work better for me:  the SUPPOSED paradigm shift…

As I often do, I must preface the rest of this post with a bit of a disclaimer.  The keynote is by Sheri Stahler, the Associate Vice President for Computer Services at Temple University.  She is clearly an intelligent person and I’m sure she’s a great VP and manager.  She certainly is a very affable and friendly person – at least she was when we ran into each other in the elevator at the hotel at which this conference is being held.  This is not a criticism, much less an attack, on her in any way.  This is about the points being made.  These perceptions are not uncommon in higher ed (as evidenced by some of my fellow attendees raising their hands to certain queries posed by Ms. Stahler), and that truly and deeply worries me.

Ms. Stahler’s points centered on a supposed paradigm shift caused by web 2.0, web 3.0 (2.0 plus federated identity via Facebook Connect, etc.), social media, and the changing perspectives of today’s students.  The claim is that this shift jeopardizes the very jobs of IT staff in higher education: our methods are no longer effective, and our jobs are in danger.  This is a gross oversimplification, admittedly.

I had the pleasure of convening and attending a presentation by Dr. John Hoh, the Director of Information Technology Services at the Harrisburg campus of the Pennsylvania State University, later that same day.  While it’s awfully difficult to describe the entire session, the gist is that one must look strategically and quite critically at one’s service portfolio: identify which are commodity services that can be outsourced, which are high-maintenance, low-value services that should be handled by only a small set of staff, and which are the “meat” of your overall services – the things you want to be good at, and that you want others to know about.  Determining this requires a very forward-looking perspective.  As Dr. Hoh said, the goal is to become solution-providers, not break-fixers.

Being a solution provider means that one can identify issues, see trends as they emerge, and move to take advantage of those trends as appropriate.  If one is a solutions provider, then one’s job cannot be, by definition, in danger.  It is exactly this need to see emerging technologies not just for the dangers they pose to our existing duties but also for the opportunities they present that future-proofs such staff from becoming obsolete.

Even without taking Dr. Hoh’s aggressive, progressive stance, I would argue that we are all in the business of analyzing the ecosystem that includes technology and higher education.  In the same way that we must now consider how to deal with the emergence (eruption?) of the tablet device or the commoditization of Help Desk services, IT departments previously had to examine the commoditization of personal computers and the emergence of computers as a part of everyday academic life, and develop those very same Help Desk services.

In conclusion, we must look at ourselves as solutions providers, and ones that determine those solutions based on our ability to analyze changing scenarios.  We have never just been IT folks, and we certainly should never be people that focus on how the “way we’ve always done things” is or is not threatened by change.  Our jobs should be to analyze and change with new trends.  While our duties might change, our job does not.

stuck in an analog world

Last week, during my budget meeting, I got to “see” a great tool that our finance officer had put together.  It was a spreadsheet, true, but anyone that has worked with really complex ones knows that a properly designed sheet that has every reference done just right and provides the right data is as valuable as the $10,000 server software running on the $15,000 server in the data center.

What was weird is that, after being told of this great file, I was given a paper copy of what it looks like.  I didn’t get to see any of its dynamic nature.  I didn’t get to punch in my numbers and see how my proposal and/or its variants affect other parts of the school.  I didn’t get to interact with it.  It was an inherently digital artifact in analog form.
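The interaction lost on paper is easy to show with even a toy model.  Here is a minimal, purely hypothetical sketch (invented department names, numbers, and rates – not the finance officer’s actual workbook) of the kind of what-if calculation a live spreadsheet gives you and a printout doesn’t:

```python
# A toy what-if budget model: change one department's proposal and the
# dependent figures (overhead, reserve, grand total) update automatically,
# just as linked cells would in a well-built spreadsheet.

def school_budget(proposals):
    """Given per-department proposal amounts, compute the dependent totals."""
    total = sum(proposals.values())
    overhead = round(total * 0.12, 2)   # assumed flat 12% overhead rate
    reserve = round(total * 0.05, 2)    # assumed 5% contingency reserve
    return {
        "total": total,
        "overhead": overhead,
        "reserve": reserve,
        "grand_total": round(total + overhead + reserve, 2),
    }

# Baseline proposal vs. a variant where IT asks for $50k more.
baseline = school_budget({"library": 250_000, "it": 400_000, "clinics": 150_000})
variant = school_budget({"library": 250_000, "it": 450_000, "clinics": 150_000})

print(baseline["grand_total"])  # 936000.0
print(variant["grand_total"])   # 994500.0
```

On paper, the second scenario would mean reprinting the whole sheet; in the live artifact, it is one changed number.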

This struck me as a classic misalignment of the traditional meeting room and the digital commons (or some small version of it).  Meeting rooms are about handing around stacks of paper, scribbling down notes, and then (hopefully) filing all that away in a place you can find later.

Working together in a digital commons is about interacting with files such as the one described, looking at different scenarios and sharing information via various collaboration tools (maybe I could import the data quickly via a cloud-based sharing tool.  Or have it already in that tool and available as part of the numerous other cloud-based budget folders shared to the finance officer).  Taking notes would be done on, say, a tablet, where one does direct, digital markup of the original proposal.

Everything stays digital.

Not every meeting should go this way.  But one that is based around a dynamic, digital file…that probably should.

everyone’s got it all figured out. so why do we still have problems?

[needs some editing, but something I want to get out]

I attended a rather interesting session today at the AJCU-CITM conference on the future of technology and how CIOs at today’s universities need to respond.  We discussed an article on Gartner’s predictions for technology trends in 2011, which argued that the “consumerization” of highly capable mobile technology necessitates a change in how we manage technology.  The article deserves a long post of its own, but the session was the interesting part.

In a room of CIOs at major universities, supposedly all facing tremendous challenges in managing tight budgets, administrative pressure, and creating productive teams that would change the nature of IT, everyone apparently had it all figured out.

“How many of you have Business Analysts in your group to examine processes?”  Lots of hands.

“How many of you have strategic plans that outline your organization’s goals?”  Lots of hands.

“Do any of you have dedicated project management offices?”  Lots of hands.

If, in fact, these are the elements of a successful IT shop, and one that increases the Information part of IT, why do schools still face such challenges?  If everyone has it all figured out, why do we face problems at all?  Why are we not already the most nimble, agile organizations that will take higher education technology into the next 15 years?

Don’t get me wrong – I don’t have any of these answers.  After the session, I’m not sure whether I’m even doing my part in providing the right information to our students.  I’m not sure that we’re providing the right services to our community, and maybe whether I’m actually doing right by the students themselves.  I’m full of doubt.

But no one else in the room seemed to be.  Yet we all face these problems, and no one is perfect.  So why did the conversation seem so repetitive?  And so anti-climactic?

ubiquitous computing: holy grail in reach

About six years ago, we used to make fun of our boss because he always talked about “ubiquitous computing.”  At the time, it was the catchphrase, thrown around like it meant something more than a yet-to-be-realized utopian idea.

The idea that computing technology (define that as you wish, but I think most people will get within striking range of the same definition) would become so common that everyone had it everywhere was kind of the holy grail concept a while back.  It was logical – a generation of users that grew up with technology everywhere, devices getting smaller and more powerful by the day, and more and more connectivity as wireless networks popped up here and there.  But we weren’t there yet.  Phones were not yet smart, computers were still too slow, and the best we had was the few local wireless networks around.  Broadband cards weren’t even an option then (and the speed on those networks would have been unusable anyway).

We are almost there.  First, the iPhone got us 99% of the way.  Most people that know me would find it surprising that I’d give so much credit to an Apple product (I’m not against them, but I dislike the fanboy atmosphere that surrounds their products and the company).  But even if you look at Android phones, Symbian phones, WebOS devices…they all started from the iPhone: an incredibly powerful device that also had an everyday purpose – cell phone – that made it simple, logical, and easy to always have around.  With faster network speeds, wifi options, and increased competition, the features of smart phones in general have made it to the point where I do a lot of just plain web surfing on my phone.  Yes, I use the Google Reader app to keep up with RSS feeds, and another app to find restaurants, etc., but sometimes I just do a google (voice) search and see what happens.  I use the phone like a computer.

Many would say that with these smart phones we’re already at the point of everywhere computing.  However, the last 1% that I leave out is a critical one.  How do we connect all of these devices and the content that they provide and, more important, that which we create, together into one big mesh that is our “work?”

Right now, I use dropbox.com and box.net to bring a lot of my stuff together.  I have the apps on my phone, the files on my computers and online.  I work on a file on my computer at my desk at home, email it to colleagues while on the train going to work from my phone, then open it up on my other computer at work to get more done.  My address book is all Google now.  I am thwarted in my efforts to keep my book up to date with work contacts (to keep it truly my “one” address book), but overall all of my contacts are there, in Google.  With the right software on the server side, I can now look at my work calendar and e-mail on my phone and any computer.  I can tie the calendar in with Google Calendars (kind of) and then view multiple calendars at once to know all of the things going on in my oh-so-busy-life.

As I look at students today, here at the law school, working with tablets and doing some really innovative stuff with organizing large amounts of data, part of me says “it’s really here!  now we have these ultra-portable, highly usable mobile devices!  ubiquity!”

But then I see someone e-mail a file to themselves so they can work on it later.  Or they have to refer to notes on the iPad while doing a full brief on the computer.  That is a massive disconnect, and one which we, as Technology staff in higher ed, must remedy.

it’s all in the words

I think that one’s choice of words can really say a lot about one’s perspective on many matters.  One can certainly read too much into words – trying to get tone and meaning out of an e-mail is an invitation to disaster and misinterpretation (today, I decided against sending an e-mail and opted for a phone call because I couldn’t find a way to write a response without sounding cold…).

But choice of words can mean a lot.  A while ago, I was having a conversation about our respective departments and, therefore, staff.  It’s perhaps too subtle, but what I noticed was:

“they work for me.  they are supposed to do what I ask or tell them to do.”

“we work together to make our decisions happen.  if our actions deviate too far from the plan, I ‘correct’ things, but then we try and keep moving along.”

That’s a bit of an exaggeration in both examples.  But in both situations the staff do work for us, and they are responsible for making our plans happen.  And we are in charge of making those plans.  But the two descriptions couldn’t be more different in perspective.  And it’s probably safe to presume that there is an underlying, corresponding difference in approach to management.

Just some random thoughts.

an olive branch turned into stifled innovation

Disclaimer:  The Tech Steering Committee Innovation Grants offered by Santa Clara University are a terrific idea in support of those with viable proposals to move teaching and learning forward.  Without these grants, many projects could not even get off the ground.  That these grants even exist at a smaller university is a testament to the commitment, from the very highest levels of the university, to innovative uses of technology in ways that meaningfully affect education, learning, and the university experience.

Having said that…a recent experience with the TSC grants has left a very bad taste in my mouth.  It seems contrary to the goals of the program, in fact.  It takes what was an olive branch offered in line with the very criteria for a proposal and turns things all around, potentially stifling innovation.
