Archive for the ‘Science & Technology’ Category

Let me start with a question. Has any KT session you have attended ever left a mark or helped you in the long run? There will be very few who can say yes to this with any degree of confidence. They would be the lucky ones. For the rest, it is just an interminable few days of listening to a monologue by some person who does not really care whether you understand or not. I know, because I have been such a person on occasion.

For the uninitiated, KT, in information technology parlance, is knowledge transfer. It is one part of the larger, organisation-wide practice of knowledge management: a process for disseminating very specific, concentrated chunks of information accumulated over the years through hard work and bad luck. The principles behind the Agile Manifesto put it quite elegantly:

“At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.”

Right from the plain-Jane shared drive containing relevant (mostly irrelevant) documents to the US Army's standard After Action Review (AAR), there are several ways of passing on what one has learnt. Gartner’s Darin Stewart highlights one aspect of the AAR in his blog which I think is very significant – no recriminations. Whatever is said during these reviews cannot be used by anyone to assess an individual’s performance in the project or task. The rule is – no one shall play God and denounce thee.

The IT industry, especially Indian IT, is service-oriented in nature. There is a very obvious master-slave relationship (crude but close enough :-)). The only thing missing is the line “As you command, Master” from people’s email signatures. Nevertheless, processes are taken seriously, even without any push from the customer, and reviews are held frequently. Any post-mortem reviews (appropriate, right?) that are held delve into great depth. Pages of reports are produced, actions are assigned (notice the verb), superiors are informed, and 15 days later everyone has forgotten about it. This is very much the case if the previous milestone was a success, and even more so if it was a failure.

In the latter case, it becomes very obvious within the first five minutes of the meeting that people are trying to cover their backsides. How many times have you seen a colleague hunting down an old email proving that he/she knew the system would fail but did nothing because no one approved a fix? During the annual performance appraisal he/she can then throw the email in the supervisor’s face and demand to know why the rating was so poor. One of the most amusing things I have seen during these review meetings is the formation of a review team to ensure such slips do not happen again. Quite like the committees our politicians are so good at setting up, and with the same end result: no one remembers what the review team was supposed to do.

I’m digressing. My point is that such post-failure review meetings tend to become a blame game, and the one who is not vocal enough or smart enough to duck gets blamed. It is a natural reaction to fear negative consequences and to protect oneself from them. Instead, like the US Army, these review meetings should concentrate only on disseminating what was learnt from the failure. When the team knows that nothing disclosed in such meetings will lead to a downgrade in their performance assessment, there will be a far better release and absorption of these ‘lessons learnt’. It would then not matter whether anyone kept minutes or not; the meeting would simply transform into a group discussion, with each person retaining almost all of what was discussed. If this kind of philosophy works for the US Army, which I think you will agree has more important things to do, it should work for a team of geeky, overweight software developers.

There are caveats, of course. To catalyse such a discussion, a skilful moderator with great people skills would be needed. It is a role in which trust has to be implicitly established. The team will need to understand that humans make mistakes and that it takes another human to recognise and accept such mistakes. The moderator has to be this person. Is it Christmas yet? A miracle is required in the corner IT shop.

One can argue that essentially overlooking people’s mistakes will only cause more harm and that people will not try to change until someone comes at them with a stick. I beg to differ. As Dr Frasier Crane used to say, I believe in the basic goodness of humanity. People will be ashamed enough to change when they see others overlooking their mistakes (it’s not wishful thinking, it’s Gandhigiri :-)).


In one episode of The Big Bang Theory, the protagonist tries to garner support and a research grant for some experiments in advanced physics. He is successful, but in a rather twisted way. The example shows that even popular culture knows that any project in advanced physics will be very expensive. Proving any hypothesis requires loads of money, and the money certainly does not guarantee success. Corporations and governments spend billions in aid of such projects in the hope of a breakthrough. So how much do you think the most expensive such project ever undertaken costs?

Thirteen billion euros. Yes, the figure is correct. That is the estimated cost of the ITER project (International Thermonuclear Experimental Reactor), a collaborative attempt by 34 countries to prove that energy can be produced by thermonuclear fusion outside a laboratory. Its aim is to produce about 500 MW of power from an input of about 50 MW. When you have the time, visit their site. The numbers are astounding. Astounding because this is essentially a proof of concept. When it is complete (should I also say ‘if’?), the scientists involved will have categorically proven that it is possible to build commercially viable nuclear fusion reactors. Someone will then have to actually build such a reactor so that countries can buy one. It will have to be a country that buys one, I guess, given the likely cost of such a thing. All this will take decades. The earliest date for the prototype to start producing energy is 2026.
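Put another way, those headline figures work out to a fusion gain factor, usually written Q, of ten:

Q = P_output / P_input = 500 MW / 50 MW = 10

Anything above Q = 1 means the reactor gives out more fusion power than the heating power pumped into it, and a gain of ten is the sort of margin that would make the case for commercial plants.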

[Image: ITER Lego sculpture by Sachiko Akinaga; photograph by Hironobu Maeda; credit: Scientific American]

So is this the right thing to do when the world needs solutions to its energy problems quickly? Naysayers would simply dismiss it as a colossal waste of money, whereas the scientists involved hail it as mankind’s saviour. Scientific American ran a recent article about the various problems the project is having (funding, of course, gets talked about). There is no question, however, that fusion, once domesticated, will be our main source of energy. Until that point the world should be looking at alternatives.

Easier said than done, that. And yes, most countries are looking at other energy sources: batteries, more efficient solar panels, hydrogen fuel cells and new deposits of natural gas, to name a few. Hang on, something’s missing from this list! Where is nuclear fission? The world, it seems, has lost faith in fission. New fission reactors are a big no-no nowadays, after Fukushima of course (but warheads are obviously multiplying). This is more of a knee-jerk reaction; something similar would have happened after Chernobyl or Three Mile Island. There are people thinking about making fission safer, cheaper and less damaging to the environment. The most noteworthy example is Taylor Wilson. You might say – who? I would have too, had I not seen a recent TED talk of his. Wilson, then 18, won a Thiel Fellowship in 2012 as an applied nuclear physicist. And he has built a fusion reactor in his dad’s garage! In the talk, he describes a design for a modular nuclear fission reactor capable of producing 50-100 MW. If the design works, it may not be as expensive as a conventional reactor and probably not as complicated, which means it would be easier to build and maintain, especially in developing economies.

Someone should be looking at a comprehensive plan for energy. Someone with the reach of the World Bank or the UN. And they should take a leaf out of Agile’s book and start showing quick results to an almost-panicking world.

PS: Here’s my 4-point plan for energy (go ahead, laugh :-))

  1. Reduce energy consumption. Any which way. See this post by T K Arun (Times of India blogs) as an example of the lengths one can go to.
  2. Develop hydrogen fuel cell infrastructure for vehicular usage. Period.
  3. For the near term, look at fission, solar, wind and hydel energy solutions. Move away from burning anything having carbon in it.
  4. Make fusion cheaper. If a teenager can build a working prototype of a fusion plant in a garage, why does one need to spend gargantuan sums of money elsewhere to build a different one?

I follow the Gartner and Forrester blogs on a regular basis. Both consultancy firms (is it right to call them that?) talk a great deal about Master Data Management (MDM): its benefits, its pitfalls and the lot. Being professionals, they would have done detailed analyses and dozens of surveys to arrive at conclusions on what MDM is and what to do and not do with it. I have learnt a good deal about the topic from these blogs. But I do find something missing.

Wikipedia describes MDM thus:

“Master Data Management (MDM) comprises a set of processes, governance, policies, standards and tools that consistently defines and manages the master data (i.e. non-transactional data entities) of an organization (which may include reference data).”

Other sites essentially describe the same thing in different terms. In short, it is a mechanism to arrive at and maintain a single version of the truth about the data an organisation holds about its customers. A common example used to illustrate this is that of a bank customer. Let’s say a customer buys an insurance policy from his bank and a week later gets a call from the same bank trying to sell a similar kind of policy. Such a thing is quite normal (and annoying) and most of us will have faced it. Why would this happen? Doesn’t the bank know that you have just bought an insurance policy? It would suggest that either the bank does not have an MDM solution in place or that its MDM solution is ineffective.
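To make the ‘single version of the truth’ idea concrete, here is a minimal sketch in Python, using made-up record structures rather than any real MDM product’s API. It consolidates product holdings from separate systems under one master customer ID so that an outbound campaign can see what the customer already owns:

from collections import defaultdict

# Hypothetical feeds from two separate systems of record
insurance_system = [{"customer_id": "C001", "product": "term-life-policy"}]
campaign_list = [{"customer_id": "C001", "offer": "term-life-policy"}]

def build_master_index(*feeds):
    """Group every product holding under a single master customer ID."""
    master = defaultdict(set)
    for feed in feeds:
        for record in feed:
            master[record["customer_id"]].add(record["product"])
    return master

def filter_campaign(campaign, master):
    """Drop offers for products the customer already holds."""
    return [row for row in campaign
            if row["offer"] not in master.get(row["customer_id"], set())]

master = build_master_index(insurance_system)
print(filter_campaign(campaign_list, master))  # [] -- the duplicate sales call is dropped

Even this crude consolidation would have spared the customer in the example above that second phone call.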

Take another example. I have a savings account and a credit card with a large international banking corporation. I had applied for the two at different times and, unfortunately, in one application I had expanded one of my initials while in the other I had not. To escape the hassle of maintaining a separate login for each product, I initiated a request for the accounts to be merged. However, it was declined due to the difference in names in their database. I was disappointed, to say the least.
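My guess is that a modest matching rule would have been enough to reconcile the two spellings. Here is an illustrative sketch, with hypothetical names and a deliberately simple rule; real MDM tools use far more sophisticated fuzzy matching:

def tokens(name):
    """Split a name into lower-case tokens, dropping the dots after initials."""
    return [t.strip(".").lower() for t in name.split()]

def compatible(a, b):
    """True if every token pair matches exactly or one is the other's initial."""
    ta, tb = tokens(a), tokens(b)
    if len(ta) != len(tb):
        return False
    return all(x == y or x == y[0] or y == x[0] for x, y in zip(ta, tb))

print(compatible("A. Kumar Sharma", "Arun Kumar Sharma"))  # True
print(compatible("A. Kumar Sharma", "Anil Kumar Verma"))   # False

A rule like this treats an initial and its expanded form as compatible, which is exactly the difference that blocked my merge request.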

In both situations, it is the customer who has to compromise and live with the outcome so that the bank can manage his or her data. Customer or user experience (UX) – this, I think, is what is missing from what MDM is taken to mean.

Like the 12 principles behind the Agile Manifesto, one of the tenets of MDM should be to act as an enabler in creating a seamless, consistent user experience for the customer. It is unlikely that any MDM implementation would be wide enough in scope to actually define the UX component of the business. However, from the perspective of the organisation’s enterprise architecture practice, the MDM solution should open gateways for the UX to offer a consistent experience.

An MDM solution, then, should look at how well the existing business processes that deal with UX can be integrated, and should propose changes to those processes and the associated IT components.
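To picture what ‘opening gateways for the UX’ could look like, here is a sketch of the idea only, with hypothetical identifiers and fields: every channel resolves a customer through the same lookup and therefore presents the same master view.

# Golden records and source-system mappings are invented for illustration.
GOLDEN_RECORDS = {
    "C001": {"name": "Arun Kumar Sharma", "products": ["savings", "credit-card"]},
}
SOURCE_TO_MASTER = {"NETBANKING-7841": "C001", "CARDS-00912": "C001"}

def customer_view(source_system_id):
    """Return the single master view, whichever channel asks for it."""
    master_id = SOURCE_TO_MASTER[source_system_id]
    return GOLDEN_RECORDS[master_id]

# The net-banking portal and the cards call centre see the same record.
print(customer_view("NETBANKING-7841") == customer_view("CARDS-00912"))  # True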


Wikipedia can then say:

“Master Data Management (MDM) comprises a set of processes, governance, policies, standards and tools that consistently defines and manages the master data (i.e. non-transactional data entities) of an organization (which may include reference data), allowing its customers to have a consistent, seamless user experience.”