Tuesday, January 26, 2010

Simplifying != Lying


The Unified Modeling Language defines a lot of different mechanisms or views for documenting the structure and behavior of a system.  As you move through them, they provide greater levels of specificity or higher levels of abstraction.  I worked on an effort several years ago that employed a set of UML-like standard views to describe a set of key applications.  In that effort, the views were labeled Contextual, Conceptual, Component, and Deployment.

The Component and Deployment views came straight from the UML standard.  The Conceptual view followed somewhat from this Wikipedia definition, but extended beyond the entity-relationship domain to include processes in something of a data flow.  The Contextual view was used to describe the surrounding business processes, actors, and external systems that interacted with the application or system being described.

It was a challenging exercise to work through the application systems that I was responsible for and document them appropriately, without overdoing the documentation.  Personally, I found the exercise very satisfying.  It helped me prove out my understanding of the systems I worked with.  Often, I'd start with a middle-layer view, like the Conceptual or Component view, then work my way either down or up, iterating back and forth a few times to make sure the depictions remained consistent, honest, and valuable.
As a side note, that sort of iterative, non-linear process of working up and down between different levels of abstraction reminds me a lot of the YouTube videos I watched a while back when learning how to use an abacus.  If you aren't familiar with how to use an abacus for simple math, take a look.  If you're trained in traditional Western arithmetic, it'll be an eye-opener.

 When introducing the idea of these four views to subsequent teams, one of the biggest surprises has been the level of dishonesty that tends to appear in the higher levels of abstraction.  I've also found it hard to explain what it means to be dishonest about an abstract view of something. 

For instance, in a Conceptual view, I've seen people draw a box that is intended to represent one conceptual system when the box actually comprises two or more entirely independent processes.  By this definition, it would be dishonest to diagram a single box labeled Item Master that has arrows sending data to separate operational systems if there is, in fact, no system or business process that actually manages a conceptually unified source for the item information in those systems.  There doesn't have to be some physical thing, or even one single source of truth; but if the item information is built separately in those two systems, with no regard for the other system, then it would be disingenuous to suggest there's even a conceptual idea of a single Item Master.  That's not to say there should or shouldn't be one -- but to document a conceptual view of it in the current state would be dishonest.


I've seen this happen enough times that I believe there's some common confusion about what it means to build abstract models.  Simplifying how we look at something, using abstraction, should never result in a lie about how things actually work.

It's a lot like introductory science and math.  Remember the lessons about significant digits and the difference between precision and accuracy?  Precision defines the degree of specificity and detail to which an answer is given.  A component diagram is more specific about the actual construction and inner workings of a system than a conceptual diagram is.  To be correct, though, both must be highly accurate depictions.  Losing accuracy during abstraction will only get you in trouble.
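To make the precision/accuracy distinction concrete, here's a toy sketch in Python (the numbers are invented purely for the example):

    # Toy illustration of precision vs. accuracy; all numbers are made up.
    true_value = 9.81  # the thing we're trying to describe

    precise_but_inaccurate = 12.73645  # many significant digits, far from the truth
    accurate_but_imprecise = 10.0      # few significant digits, close to the truth

    # A conceptual diagram is like the second number: coarse, but honest.
    # A dishonest abstraction is like the first: detailed, and still wrong.
    print(abs(true_value - precise_but_inaccurate))  # ~2.93 -- large error
    print(abs(true_value - accurate_but_imprecise))  # ~0.19 -- small error

The coarse answer loses detail but stays accurate; that's abstraction done honestly.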

Friday, January 22, 2010

Slow Processing


When I was in grade school and junior high school, I was considered to be pretty good at math.  I went to math competitions all across the state and won 1st place more often than I didn't.  In most of the competitions, time didn't matter to the score.  It was a timed test rather than a race.  In my head, though, I needed to be one of the first to finish.  Being right was fine.  Being right and first?  That was really winning!  As you'd expect, that got me in trouble every once in a while.  I'd get sloppy on a couple of questions and not win 1st place.  Going back over the tests, I'd kick myself for not simply rereading the questions before answering.

My mother was the one who kept coaching me to slow down.  Read the question a couple of times.  Check your work.  I don't know if I've ever really learned the lesson to slow down and double check my work as much as I should.  If you've been reading many of my blog posts, I'm sure you've caught a plethora of typos.  Lucky for me, Firefox usually catches my misspellings.

The same lesson applies to how we process data for data warehouses.  Consider a somewhat typical batch ETL process:
  • Process input files:
    • Surrogate key lookups
    • Field level transformations
    • Calculations
    • Mapping to target tables
  • Load files into staging tables.
  • Apply updates and inserts.
Then, after the loads are complete, run a series of validation routines to ensure that all of the data that was received was actually applied to the target tables.  Total the input by some grouping; total the output by some grouping; and compare the results.  If things don't match, then send someone off to start digging through the data.
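For illustration, here's a minimal Python sketch of that balance-and-compare step.  The store_id and amount columns are hypothetical stand-ins for whatever grouping and measure your warehouse actually uses:

    from collections import defaultdict

    def totals_by_group(rows, group_col, amount_col):
        # Total an amount column by some grouping column.
        totals = defaultdict(float)
        for row in rows:
            totals[row[group_col]] += row[amount_col]
        return totals

    def validate_load(input_rows, target_rows):
        # Compare input totals to loaded totals; return groups that don't balance.
        expected = totals_by_group(input_rows, "store_id", "amount")
        actual = totals_by_group(target_rows, "store_id", "amount")
        mismatches = {}
        for group in set(expected) | set(actual):
            if abs(expected.get(group, 0.0) - actual.get(group, 0.0)) > 0.005:
                mismatches[group] = (expected.get(group, 0.0), actual.get(group, 0.0))
        return mismatches

Note that a non-empty result tells you a group is off, but not which rows caused it -- that's exactly the digging this approach leaves to a person.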

Certainly, that load process is very efficient.  Doing all those block-level operations and applying large chunks of updates all at once, rather than doing one-record-at-a-time processing, gets the data loaded to the warehouse much more quickly.  Bulk operations, by reducing transaction management overhead and improving I/O throughput, definitely do that.  Don't forget, though, that in order to feel truly confident that the processes completed successfully, you have to read all of that data a second time to balance out the totals.

A slower alternative is to process the data more sequentially and validate that a single row makes it to each intermediate staging area or directly into the target table.  Create a process that watches every row get transformed and loaded, and reports on exactly which records failed to load rather than merely telling you that the final insert/update somehow came out 10 rows short of what was expected.
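As a sketch of that alternative (again in Python, with hypothetical field names and a stand-in transformation), the per-row bookkeeping might look like this:

    def load_row_by_row(input_rows, surrogate_keys, target):
        # Process one row at a time, recording exactly which rows fail and why.
        failures = []
        for line_no, row in enumerate(input_rows, start=1):
            try:
                key = surrogate_keys[row["natural_key"]]   # surrogate key lookup
                record = {"key": key, "amount": float(row["amount"])}  # transformation
                target.append(record)                      # the insert/update step
            except (KeyError, ValueError) as err:
                failures.append((line_no, row, repr(err)))
        return failures

Each failure is reported with its line number and cause, so no one has to reconcile totals after the fact.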

So, it seems to me the options are these:
  • "Fast" - load in bulk very quickly; do batch validation; if totals don't match, kick out for research and troubleshooting.
  • "Slow" - process each row more methodically and kick out exactly which rows fail any piece of the process.
There are complexities to this choice, clearly, and they depend on things such as the complexity of the ETL, the size of the batches, the available processing windows, system usability during processing, etc.  The most important "slow" lesson is to examine the situation you're in and make a rational decision about how to process data and validate that it loaded correctly.  Don't make snap assumptions about the right way to ensure data integrity for any particular process.

Wednesday, January 20, 2010

BI for MBA students

I've got the opportunity in 6 weeks to guest lecture for one evening to an MBA class about Business Intelligence systems.  The course is a general Information Systems class that the MBAs are required to take, so I'd like to make sure the conversation does an effective job of engaging them and helping them understand the role systems and technology play in supporting decision making.  Any thoughts or advice on important things to make sure I cover in 2 hours?

Post a comment or tweet @paulboal.

Tuesday, January 19, 2010

Stand Slow

As the title of this blog shows, I'm a big R.E.M. fan.  They're great performers, and I've always loved their lyrics.  Thinking about what it means to me to slow down, the lyrics to "Stand" seemed appropriate.
Stand in the place where you live
Think about direction
Wonder why you haven't before
Now stand in the place where you work
Think about the place where you live
Wonder why you haven't before


If you are confused check with the sun
Carry a compass to help you along
Your feet are going to be on the ground
Your head is there to move you around


If wishes were trees the trees would be falling
Listen to reason
Season is calling


Stand in the place where you are
Stand in the place where you are
Your feet are going to be on the ground
Your head is there to move you around, so stand.
I want to highlight a few of the lyrics in the context of what it takes to do effective technical analysis and design activities.


"Stand in the place where you live"
Start the work at hand by examining it from an internal perspective.  Look at the data, process, or system from the context of someone who really knows it well.  Learn from them some of the intricacies of how it's put together and how it behaves.  These are the people who really live with the system day in and day out.

"Stand in the place where you work.
Think about the place where you live."
Now look at the data, process, or system from the perspective of someone who is a user or consumer of it.  Get their perspective on how things look from outside the day-to-day life of supporting and maintaining the thing being analyzed.  The important thing is to get outside of your own skin and try to look at things from alternate perspectives.

"If you are confused, check with the sun.
Carry a compass to help you along."
Remember that the title of this song (and every third word) is "stand."  We need a vision and guiding principles in order to know that, as we dig deeper and deeper into the weeds of a project, we don't lose sight of what the driving purpose is.  Remember, when you get down into the weeds, it may seem easier to follow the goat path that someone else already pushed down rather than following the sun (vision) and your compass (guiding principles).

"Listen to reason; reason is calling."
As Jim Harris (ocdqblog) reminded us at the beginning of the year with, "Shut Your Mouth," there's high value to be found in learning to listen.  It can be difficult to know how long to listen before starting to draw conclusions and respond.  It's good advice to to listen even more closely as things seem to sound ridiculous.  For example, I was in a conversation once that described how a payroll process worked.  I was flabbergast by the amount of special logic that went into the business process.  It seemed ridiculous.  Then, I listened harder and further into the conversation (and asked some open-ended questions) to really understand.  As I understood more and more about what I thought was a ridiculously complex process;,

"Your head is there to move you around."
Key emphasis on head.  You should make decisions based on the thoughts in your head, not on your gut feelings running around in circles.

Thanks to Stipe and company for the song!

Monday, January 18, 2010

Slow and Agile

I first read the Agile Manifesto nearly 8 years ago.  Anyone familiar with agile principles understands that one of the underlying goals is to produce better solutions, faster, with less wasted energy along the way.  Anyone familiar with introducing agile practices to an existing team knows that agile works best with strong individuals.  Read the manifesto with that in mind (their emphasis, not mine):
Individuals and interactions over processes and tools
Working software over comprehensive documentation
Customer collaboration over contract negotiation
Responding to change over following a plan
I want to be clear - I am a strong supporter of the agile philosophy.  For those of you out there who, like me, would rather spend your work time interacting with a computer than with a human being, I want to point out that agile relies heavily on human-to-human interaction and a high degree of trust in team members and stakeholders.  Key words: individuals, interaction, collaboration.


This is where being slow becomes so important.  Clear communication requires both the sender and the receiver of the communication to be very intentional.  "Need the system to do X."  "Got it!"  That is not an effective interaction.  Both parties have to slow down the pace of their interaction in order to be intentional about what they're communicating and make the practices effective.



Speakers:
  • Studies show that we have to repeat ourselves in order for effective comprehension to take place.  Tell the listener what you're going to say; say what you're saying; tell the listener what you just said.
  • Look for comprehension, both verbal and non-verbal: head nodding, eye engagement, note taking, clarifying questions.
  • Provide opportunities for questions.  Some listeners may not be great at asking questions up front, so give them ample time.  Pause after significant points or topics that may be unclear.
  • Validate the receiver's understanding.  Ask for comprehension about specific points, not just a general "does everything I said make sense."
  • Listen and watch for when the receiver flinches, raises an eyebrow, takes extra notes, or tries to break into the conversation.  What you think is already clear is the most likely place for miscommunication with someone unfamiliar with the topic.


Listeners:
  • Listen actively.
  • Confirm the assumptions that you have about the topic going into the conversation.  You and the speaker may be starting from different places.
  • Validate assumptions you have with what the speaker is telling you.
  • If something the speaker says doesn't sound right, that's a clear sign of some misunderstanding.  Either one of you actually has the information wrong (in which case, facts need to be validated and clarified for both parties) or the communication is not working effectively.  In either case, ask questions and clearly state your assumptions in order to clarify.
  • Repeat what you're hearing back to the speaker in your own words.  Become the speaker and look for their honest confirmation that you've understood them.
  • State any conclusions that you've drawn from the conversation.  They're likely based on both the information in the conversation as well as other unintentional assumptions.
  • Confirm the action items, changes, or impacts of the conversation.  Describe what it is that you think this conversation means to the project; and ask the speaker that same question.
Agile is pro-change (so to speak) in that it doesn't fear changing needs.  I believe that the message in that point of the manifesto is to encourage teams not to fear change and to be willing to build what is needed rather than what is written in outdated specification documents.  The key there is that the team is working to build what is needed.  Whether we're building from technical specs or from personal interaction, effective communication and collaboration are required.

So, make sure you take communication slowly and intentionally.  Emphasize communication performance, not just responsiveness.[1]



[1] To expand on that footnote: the Wikipedia entry for "responsiveness" makes a great point about the difference between performance and responsiveness.  In that example, the point made is that for usability, it makes sense to give the mouse driver a higher priority than background processing so that the user experience is more pleasant.  In an interview, though, the background activity of comprehension needs plenty of processing time.  The verbal back-and-forth of the conversation can take a decreased priority to ensure that good understanding is happening.

Friday, January 15, 2010

The Power of Slow

I spent part of my work day today doing mid-year reviews with my staff.  For us, it's merely a mile marker in the year, not any large event.  I meet with each member of my staff on a weekly basis formally as well as in ad hoc drive-bys throughout the week.  It works well to maintain that consistent and continuous communication.


During one of these 1-on-1 meetings today, one of my architects was voicing some frustration about the attitude a few project leads were taking with him.  They were pushing the team to just hurry up and build something -- whatever they could do quickly that worked.  He said he had a hard time describing to these project leads how a touch more patience, and even a different attitude about the definition of progress, would help them see the risk in their approach.

I mentioned to him that I've been slowly reading Carl Honoré's The Power of Slow since last summer.  Only in that moment did I realize there was a way of expressing "slow" to addictively "fast" people.  Slow isn't about taking more time to get things done.  Slow is about the internal pace at which things are done.  From a software architecture perspective, slow doesn't mean spending months in design before anything is built; it means taking the time, as you write each class and method, to think through how best to write that particular class or method.  It isn't about wastefully over-engineering the solution to be something that will "be scalable into the future for unknown other uses" while the immediate problem is left unsolved.  It's about being methodical and intentional in each action we take.  That creates only the appearance of being slower, because the process becomes more efficient and there's less waste moving up and down the rate-of-progress curve.




The "Slow" line represents a steady and consistent stream of work.
You can think of the dips in the "Fast" line as any number of things:
  • Hurry up, then wait
  • Work fast, then go back to fix your mistakes
To reach the 100% mark at the end of the chart, both approaches may get there in the same amount of time, but the "Fast" approach consumes a lot more effort -- effort being the length of the line itself.
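To put a number on that picture, here's an illustrative Python sketch (the curves are invented, not measured):

    import numpy as np

    t = np.linspace(0.0, 100.0, 1001)           # elapsed time (%)
    slow = t                                     # steady, consistent progress (%)
    fast = t + 10.0 * np.sin(t * np.pi / 10.0)   # hurry up, wait, rework, repeat

    def line_length(x, y):
        # Total length of the curve -- the effort expended along the way.
        return np.hypot(np.diff(x), np.diff(y)).sum()

    print(line_length(t, slow))  # ~141: the straight, steady path
    print(line_length(t, fast))  # noticeably longer: same destination, more effort

Both curves start at 0 and end at 100, but the oscillating one travels much farther to get there.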


One of the fears that "fast" people have, especially "fast" managers, is that "slow" implies developing ultra-sophisticated solutions, most of which will never actually be used; over-engineering; or architecture-creep.  For some "slow" people it probably does, but the right kind of "slow" is merely a thoughtful, methodical implementation that gives everyone enough time to make sure they're doing the next step in the project correctly.  It may feel slow, but that's only because it's efficient.


Ironically, we see this in the world around us all the time.
  • When starting your car from a stop on slick snow or ice, you have to accelerate slowly or the wheels will merely spin.  In either case, you'll move forward some, but when accelerating too quickly, you'll burn a lot more fuel and put more wear on your tires.
  • In cartoons, we see the antagonist rushing around in a hurry from place to place, while the protagonist slowly and thoughtfully moves through the chaos and reaches the goal first.
  • In action/comedy movies, we see a brash martial artist execute a flurry of activity in a dramatic attack, only to be stopped by a few small simple motions of the hero.
So many people incorrectly think that looking productive is the same as being productive.

In my own writing, my wife points out to me that there is power in brevity.  Sentences should say all that they need to say and no more.  I tend to add too many unnecessary adjectives, flourishes, and sentence complexities.  More words don't necessarily add more meaning.

Tuesday, January 12, 2010

The Role of Health Informatics

An interesting question that's always puzzled me is the difference between the terms "informatics" and "information management."  In my admittedly limited experience, informatics is used primarily in scientific and medical fields, as in "health informatics."  Information management is a more general business term.  Why the difference?

For one thing, I suppose, the etymology of "informatics" explains part of it.  The "-ics" ending means "the science of."  So "informatics" is the science of information rather than the management of information.

The Wikipedia article defines "health informatics" as having these key aspects:
  • Architecture for electronic medical records and other health information systems
  • Clinical and other health related decision support systems
  • Standards for integration
  • Standards for terminology and vocabulary

Food for thought: in the health care industry, does it make sense to co-locate business roles like master data management, data quality, information integration, and business intelligence with the traditional informatics departments, rather than building them within business management units or IT/IS units?  Does asking health informatics to work with other non-clinical business functions somehow risk or distract from patient care responsibilities?

(As a side note for those across the pond, it turns out that "business informatics" is a European term, closely related to "information systems" study, but with some subjective differences pointed out in the Wikipedia article.  It is not a term commonly used in the U.S.)

Friday, January 8, 2010

Organizational Optimization


I'm not an expert in business organization design or organizational optimization, though I have been an active decision-maker in a few organization redesigns and layoffs (the latter, I'm sad to say).  It seems to me that what a business looks like, organizationally, could follow some of the same basic guidelines that enterprise architects and software architects use for the design of a large system of applications.

Clarity of Purpose
A business unit must have a well-defined purpose.  It should be clear to the rest of the organization what the function of the business unit is, so that other business units know when to engage with it, what information it has, creates, and uses, and what purpose it serves.

Well Defined Interface
A business unit must make clear to the larger organization how to interact with it and what expectations the organization may have with regard to service completion times or delivery schedules.


Value
A business unit must be able to readily respond to questions about its value to the bottom-line performance and goals of the larger organization.  If its value is not sufficient to outweigh alternatives, then it is irresponsible to continue to use it; the lower-cost alternatives should be used instead.

Reliability
A business unit should either be self-sufficient enough to achieve its purpose alone, have plentiful access to external services such that they are unlikely to be a limiting reagent, or have sufficient influence over the priorities of external services such that they will not hinder progress.  If the business unit is unable to meet any of those conditions, then it will suffer from inefficiencies and risk not being able to achieve its goals.
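To make the software analogy concrete, here's a minimal Python sketch -- all names are hypothetical -- of what clarity of purpose and a well-defined interface look like when a business unit is modeled the way an architect would model a service:

    from dataclasses import dataclass
    from datetime import timedelta

    @dataclass
    class ServiceRequest:
        description: str

    @dataclass
    class ServiceResult:
        completed: bool
        notes: str

    class ProcurementUnit:
        # Clarity of purpose: this unit sources goods for operating units.
        # Well-defined interface: submit() is the one way in, and the
        # turnaround expectations are stated up front.
        ACKNOWLEDGE_WITHIN = timedelta(days=1)
        RESOLVE_WITHIN = timedelta(days=10)

        def submit(self, request: ServiceRequest) -> ServiceResult:
            return ServiceResult(completed=True, notes=f"Sourced: {request.description}")

The point isn't the code itself; it's that a reader can tell at a glance what the unit does, how to engage it, and what to expect in return.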

I think that's very consistent with system/software engineering principles laid out in the Unix philosophy.
  • Do one thing and do it well.
  • Write programs that work well together.
  • Data dominates.  If you have chosen the right data structures and organized things well, the algorithms will almost always be self-evident.  Data structures, not algorithms, are central to programming.
  • Fold knowledge into data (see the sketch after this list).
  • Design for simplicity.  Add complexity only when you must.
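As a small, hypothetical sketch of the "fold knowledge into data" point (the rule names and fields are invented):

    ESCALATION_RULES = [
        # (field to inspect, threshold, where to route)
        ("order_total", 10_000, "finance_review"),
        ("order_total", 1_000, "manager_approval"),
    ]

    def route(order):
        # With the rules held as data, the algorithm is almost self-evident.
        for field, threshold, destination in ESCALATION_RULES:
            if order.get(field, 0) >= threshold:
                return destination
        return "auto_approve"

    print(route({"order_total": 2_500}))  # -> manager_approval

Changing the escalation policy then means editing a table, not rewriting logic.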
The time-value curve could also be used to help shape organizational design principles:
  • Organizational structure should minimize the amount of time between an event and the decision making and action that result from that event (opportunity is lost in the interim).

Tuesday, January 5, 2010

Canned Solutions Canned

Or: 
Why we need more data professionals employed by health care providers rather than by health care consultancies.



I'm a member of a health care provider industry association that shares information across organizations about our experiences delivering business intelligence and data warehouse solutions.  Recently, one of the members of that organization posted a forum question about several packaged industry-specific data warehouse solutions that had been pitched to them.  Here was my response.  I thought it was worth sharing here as well.

My personal experience (as a BI/DW consultant and professional for the past 10 years) is that prebuilt data warehouse solutions for any industry come in three flavors:
  • Logical-only models that come with an implementation "project."  The physical implementation will be built to suit your organization and will usually use language and terminology that your organization is comfortable with.  These are full-blown projects.  Though some ideas come in the box, very little runnable code is ever included inside the box.  That takes the time and money of professional services.
  • Canned data warehouses / data marts that come as full-blown, predefined solutions.  These might seem like a good idea because so much is already built for you.  In my experience, though, they return only a fraction of the business value they seem to promise.  The model in the canned solution doesn't typically match your own business model closely enough to be as valuable as it appears during the sales meetings.  Another challenge with these solutions is the integration of data into them.  They're either so high-level that they don't have a lot of value, or they're so different from your own organization that the mapping of data into their model is overly complicated.
  • Marketing hype - in some cases, the solutions really are just a marketing message.  In reality, the vendor is merely saying "We have some people from health care and we have some BI skills.  We've put them together on a team for you."


That's not to say vendor solutions aren't a reasonable way to go for you.  When you talk with them, be very specific in your questions and very critical.  Ask to talk with other client references that are similar in size, systems, and market to you.  Do a proof of concept to implement some small portion of the solution free of charge.

I hope I don't sound too harsh on vendor solutions in that response.  I don't believe that vendors intentionally oversell their solutions.  In some cases, they're ignorant of what it takes to really build, deploy, and (most often) run, maintain, and enhance a data warehouse for several years after the initial implementation.  This was one of my personal realizations as I transitioned from consulting into a corporate job.  Other vendors may simply be putting the cart before the horse, using consulting opportunities to build up their solution offering.  No harm in that, as long as they're upfront and honest about it.

On December 30, 2009, HHS presented the much-anticipated details of the definition of meaningful use of electronic health records.  This definition provides more information about the level of functionality and use health care organizations must achieve by certain deadlines in order to receive certain types of government assistance and maintain the highest levels of Medicare reimbursement.  (I have not read the details.)  Soon thereafter, @theEHRguy tweeted a couple of great predictions:



I think it would be better if we paid those of us who actually work for health care organizations $450/hr.  Unlikely.

Still, I think he makes a very good point about consulting services, and I'd extend that to certain types of packaged vendor solutions, by nature of the services they usually imply.  The high demand for health care systems right now is going to be followed shortly by a high demand for BI/DW solutions in the same space.  In the rush, I think the industry risks implementing poorly designed packaged vendor solutions, which will result in a lot of wasted time, money, and effort, and in lost opportunities for growth and optimization.  Time will tell, but I think it's a long road ahead.

Sunday, January 3, 2010

Whose data is it? (Part 4) - Counter-Point

The past three posts in this series have dealt with negative behaviors or attitudes about the responsibility of data ownership:
  1. It's the vendor's data
  2. It's MY data
  3. It's NOT my data
As I mentioned at the end of part 3, an important factor to consider when looking at any of those situations is the history behind how the team or organization came to be where it is.  In every organization, behavioral pendulums swing back and forth between extremes in response to negative outcomes.  What might appear to be a negative behavior today may once have been a reasonable response to some other negative experience.

For instance, "it's the vendor's data" may well have developed in response to some historical data conversion or data integration project that well over schedule and over budget because data structures were poorly understood by the internal team and the use vendor services was discovered, too late, to be a huge benefit.


Likewise, the "it's MY data" crowd might come from an application support team that has come to understand the value of information integrity and master data, and has developed a sense of protectionism from those goals the hard way: by failing to meet past goals.  "We're responsible for maintaining CUSTOMER data for the company.  If someone else starts using that data without our close direction, then they'll misunderstand what they're looking at and misuse the data."

The philosophy behind this perspective is very reasonable, but the implementation becomes one of closing off access to information rather than increasing and easing accessibility while educating users and other teams on how data should be used.  Rather than keeping institutional knowledge exclusively within a support team, knowledge of how to use key information should be exported to the larger enterprise.

People who are focused on executing business processes tend to examine technology and information management from a localized context: what information does this process need as input, and what applications will this process use?  This siloed view misses the idea that a great deal of value comes from examining the space in between business processes or applications.  In the space between various business applications, there exists an opportunity to gain a higher-level understanding of how information interrelates across those various silos.

Conclusion:
Thanks for joining me in this conversation about data ownership.  Personally, I detest the word "ownership."  It has negative historical connotations that somehow suggest to me that "ownership of data" implies the "enslavement of data."  But I struggle to find a more appropriate way to refer to the concept that someone has certain responsibilities when it comes to managing information.

Perhaps it's a bit trite, but to borrow from the famous Native American proverb about the earth:

We do not inherit data from upstream systems; we provide information to downstream ones.

Friday, January 1, 2010

Whose data is it? (Part 3)

In the first two parts of this series on negative attitudes on data ownership, I talked about protectionists who want to keep tight control over who has access to what data.  The most important negative consequence of that attitude is that it stifles both understanding and innovation. Fewer people looking at data means that the data is being examined from fewer perspectives.  Innovation comes from looking at existing situations or information from a new perspective and reaching new conclusions.  So, one of the best ways to confront data protectionists is to put the argument in terms of that business driver: innovation and resulting improvements.

It's NOT my data...

The final negative data ownership pattern I'll present in this series is the "it's NOT my data" attitude.  In this scenario, people within the information supply chain simply abdicate responsibility for anything regarding information, beyond their immediate operational duties.  This can manifest itself in several ways.  With respect to data quality concerns, the "not my data" crowd will simply point their fingers upstream:
  • "That's what's in the system."
  • "That's what the sales team entered."
  • "That's what the customer told me."
All of these are only excuses for poor data quality.  By themselves, they do nothing to address data quality problems.  These are reflective of a culture that doesn't look beyond the operational duty of data entry and service or product delivery. 

A more subtle manifestation of the same attitude might look something like "You can have the data, but don't ask me what it means.  You'd have to ask so-and-so about that."  And when you ask "so-and-so" he directs you to "what's-her-face," who tells you to ask her supervisor, who explains that they're just doing it the way the Standard Operating Procedure tells them to.  In this chain of inquiry, everyone is abdicating their own responsibility to understand how their work fits into a larger picture.  It's a form of willful ignorance.

In his motivational work, Christopher Avery uses a model for responsibility that describes several responses that all come before actually taking responsibility for a situation:
  • Denial
  • Lay Blame
  • Justify
  • Shame
  • Obligation
In the "not my data" culture, groups are stuck in one of those first three levels: denial, lay blame, justify, or shame.  They either deny that there is anything to even consider with regard to data ownership and data quality; they lay blame on other constituencies that are upstream from them (even all the way to a customer!); or they justify the situation, telling themselves that there simply isn't another way that things can be done.

The engineer in me sees a certain appeal in "not my data" situations.  In the extreme, they're a great challenge in reverse engineering.  Every step yields more questions than it answers and opens new places to explore for broken processes and more denial of responsibility.  It's like a software debugging exercise that leads you deeper and deeper through twists and turns of function callbacks until suddenly you discover that critical nugget of information that finally allows you to describe the end-to-end flow of information and the real impacts of poorly implemented processes or policies.

The real-world corporate director in me sees situations that need to be addressed, supervisors to be educated or replaced, policies and procedures to be changed -- all great places for change and growth -- but also, much less enthusiastically, politics to be navigated.

Counter-Points...
There are always multiple perspectives on any situation.  Teams and organizations don't usually evolve what may appear to be negative attitudes out of ignorance or malice alone.  Often, other negative influences or behaviors lead to a culture that once served a valuable purpose but may no longer yield more benefit than harm as those previously negative influences dissipate.  The final post in this series will present situational counter-points to some of the arguments I've presented.