How to Address Failure in a Job Interview
IT failures that occurred on your watch don't have to kill your chances of landing a new job. You just need to know how to discuss them in job interviews. Here are five tips.
By Meridith Levinson
Tue, June 28, 2011
CIO — For CIOs who experience some kind of enterprise IT failure in the course of their careers—whether a high-profile security breach, massive network outage, or multi-million dollar ERP boondoggle—the incident can feel like a career killer. But unless a CIO repeatedly makes the same mistake, or the failure stemmed from some illegal or "just plain stupid" action, it won't end a CIO's career, says Mark Polansky, senior client partner and managing director of Korn/Ferry International's Information Officers practice.
"There is as much—if not more—to learn from something that didn't work as something that did," says Polansky. "Every one of those experiences is a life lesson and goes into making you a better CIO."
CIOs who wish to recover from failure just need to know how to address suboptimal work experiences in their job searches and during job interviews. Consider the following five tips for addressing the fiascoes that occurred on your watch with prospective employers and executive recruiters.
1. Fess up. Don't ever try to hide failure; you won't get away with it. If an employer doesn't already know about, say, the ERP catastrophe at your previous employer, they will find out about it eventually. Better you be the source of that information than someone else.
"A good offense is the best defense," says Polansky, who advises CIOs to proactively address their failures in job interviews. "Always bring it up. Doing so gives the candidate credibility and shows that he or she has the confidence and sincerity to broach the subject of the failed project head on."
2. Anticipate prospective employers' concerns. When framing how you discuss your failure, put yourself in your prospective employer's shoes and think about the concerns they'd have with your candidacy, advises Peter Handal, president and CEO of Dale Carnegie Training.
For example, employers will want to be assured that such a failure will not occur inside their company. They'll also want the details around what happened, why it happened, what you learned from the experience, and how you will prevent similar events from occurring in the future, says Polansky.
3. Accentuate the positive. One failed project among 10 successful ones is no big deal, says Handal. He advises CIOs to put any failures they've experienced in the context of their larger successes.
"Explain that a particular project didn't go well," says Handal. "State the reason, and if you made a mistake, explain what you learned. Then point out all the other successful things you did."
4. Offer references who can back up your story. Make sure your references will corroborate your explanation of events when employers and recruiters call them.
"If the failure was outside the CIO's purview, such as a change in business circumstances or corporate strategy, I would have references who could confirm that information," says Polansky. He adds that executive recruiters and employers will contact the references the candidate provides as well as individuals they know in the candidate's industry to vet the candidate's story.
5. How you discuss your failure is sometimes more important than the actual failure. John Hamm, author of Unusually Excellent: The Necessary Nine Skills Required for the Practice of Great Leadership (Jossey-Bass 2011), notes that the questions about failed projects that come up during a job interview are sometimes a smokescreen. When a CEO asks a prospective CIO questions about a failed project, the CEO may be more interested in how the CIO confronts the failure, rather than the project itself, he says.
"The CEO could be testing for how the CIO sees the situation and if the CIO is willing to be accountable," adds Hamm, who is also a former CEO. "They know the [CIO] role is hard and that stuff fails. They want to know how you [the candidate] will handle it."
It is possible for CIOs to go on to rewarding careers in IT leadership even with high-profile flops on their track records. After all, most organizations have some tolerance for failure.
"If you don't take risks, you can't try anything new," says Handal. "And if you take risks, you'll have some failures. Encouraging people to take risks is an important part of any organization, so you have to be tolerant of failure."
Adds Polansky, "It's always possible to recover. Sometimes, it just has to be at the right place and right time."
Meridith Levinson covers Careers, Project Management and Outsourcing for CIO.com. Follow Meridith on Twitter @meridith. Email Meridith at mlevinson@cio.com.
http://www.cio.com/article/685259/How_to_Address_Failure_in_a_Job_Interview?page=2&taxonomyId=3123
Tuesday, July 12, 2011
Intel: Human and computer intelligence will merge in 40 years
On the company's anniversary, a future of sensors, robots and new thinking
By Sharon Gaudin
July 23, 2008
Computerworld - At Intel Corp., just passing its 40th anniversary and with myriad chips in its historical roster, a top company exec looks 40 years into the future to a time when human intelligence and machine intelligence have begun to merge.
Justin Rattner, CTO and a senior fellow at Intel, told Computerworld that perhaps as early as 2012 we'll see the lines between human and machine intelligence begin to blur. Nanoscale chips or machines will move through our bodies, fixing deteriorating organs or unclogging arteries. Sensors will float around our internal systems monitoring our blood sugar levels and heart rates, and alerting doctors to potential health problems.
Virtual worlds will become increasingly realistic, while robots will develop enough intelligence and human-like characteristics that they'll become companions, not merely vacuum cleaners and toys.
Most aspects of our lives, in fact, will be very different as we close in on the year 2050. Computing will be less about launching applications and more about living lives in which computers are inextricably woven into our daily activities.
"What we think of as a computer and what we think of as IT, in general, is likely to change," said Rattner, who has been at Intel for 35 of the company's 40 years. "The intelligent systems will move from being information systems to intelligent systems that will carry out a whole variety of tasks that we just won't think of as computing tasks.... The technology will find its way into so many things we do, and we won't even think about it. The explicit way we've done computing in the past will be there, but it will be a very small subset of what we'll be doing."
Intel hit its 40th anniversary last Friday. The company launched its first microprocessor in 1971, developed a processor with more than 1 million transistors in 1989, and late in 2007 packed 820 million transistors onto a single chip.
While chip advancements will continue throughout the semiconductor industry, the nature of technology advancement more broadly will begin to change, according to Rattner.
"When you think back on where we were [decades ago] ... computers were still things that largely sat in big rooms behind big windows and were attended to by computing gurus or priests," he added. "In the 40 years, we've just completely changed the way people think about computers and computing. It's gone from a very expensive, very exclusive kind of technology to something that is unquestionably ubiquitous -- from the computers on our desks to the computers in our cell phones."
In the next 40 years, computer chips will extend beyond our computers and phones, as people want to become more entrenched in virtual worlds and computers learn to react to our motions and thoughts.
"When you see how intense the reaction is to things like the iPhone, with its use of touch and its sensitivity to motion, you begin to get a sense of, 'Gee, if machines understand the physical world and were capable of reacting to our voices, to our movements and gestures and touch, how much closer would we feel to them?'" asked Rattner. "At the same time, of course, we would like the ability to become more a part of these artificial or virtual worlds that are created entirely within the machine. We're starting to see, with things like Second Life and now Lively from Google, the ability of these machines to create these worlds that are much more comfortable for us to experience and be a part of."
As machine learning and computer perception progress, machines will take on more and more human-like characteristics, he added. Recently, scientists have been putting electrodes into living neurons in living brains, but some researchers are working on ways to transfer brain waves and organic information without electrodes, an approach that wouldn't be physically intrusive.
"You can imagine a future where, in fact, not just our very senses will be engaged, but our thoughts will drive machine behavior," said Rattner. "You can see how that boundary starts to soften and begins to blur.... There's no question in my mind that the technology will bring these two unique and distinct forms of intelligence together."
http://www.computerworld.com/s/article/9110578/Intel_Human_and_computer_intelligence_will_merge_in_40_years