Agile vs. Traditional Software Development: Why is the debate still going on? - Part I: The dangers of bounded rationality

Featured
20445 Views
16 Comments
17 Likes


My understanding is that, in practice, successful agilists tend to bring together a number of activities, tasks, and deliverables that are from beyond the boundaries of what may be called “pure agile.” This mixing and matching of software process elements from agile and non-agile (more formal) approaches is a much more practical way of using these methods.
(Bhuvan Unhelkar, Senior Consultant, Cutter Consortium, in Agile Product & Project Management, Vol. 11, No. 1, 2010.)

This article is the first of a series that I’ll be authoring for Modern Analyst. Here, and in future articles, I’ll discuss how hybrid, customized models that don’t follow any “pure” method or approach, agile or traditional, are allowing companies to develop healthier projects, and get slipping projects back on track.

My first topic is Agile’s excessive reliance on user feedback to identify and resolve requirements uncertainties early in the development process. In this article I discuss how, in agile projects, attaching too much significance to user feedback may create significant risks of schedule and cost overruns, and how the adoption of traditional analysis practices may mitigate those risks.

Agilists are constantly talking about how Agile practices support getting things right (or right enough) early on via user testing and feedback. In theory, this makes perfect sense: continually incorporate user feedback so you can quickly discard the wrong solutions and find the right ones. In practice, what I’ve learned is that while early feedback is definitely valuable, it doesn’t necessarily prevent project inefficiencies and risks, and may very well create them, when used to replace sufficient upfront analysis.

I’ll use two projects with different profiles to illustrate my point.

Imagine that Basecamp has not been invented yet, and you are trying to create a Web-based project management application focused on making collaboration and communication easy. Your users know what they don’t like in Microsoft Project, and the team is excited to work together to produce a simpler and better alternative for projects that do not require such a complex task tracking tool.

For this type of problem/opportunity, just-in-time analysis, short iterations, and early feedback will typically lead to good results. It’s not difficult to define and evaluate the functionality being produced, such as creating a to-do list, assigning to-do items to team members, and so on. Progress is fast, missing requirements or inadequate design solutions are quickly identified, and what didn’t work can be fixed in no time.
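To make the contrast concrete, here is a minimal sketch (purely hypothetical, invented for this article rather than taken from any real product) of the kind of self-contained feature such a project delivers in one short iteration. A user can exercise this behavior in isolation and judge it immediately, which is why fast feedback loops fit this project profile so well:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class TodoItem:
    title: str
    assignee: Optional[str] = None  # team member responsible, if any
    done: bool = False

@dataclass
class TodoList:
    name: str
    items: List[TodoItem] = field(default_factory=list)

    def add(self, title: str) -> TodoItem:
        """Create a to-do item on this list."""
        item = TodoItem(title)
        self.items.append(item)
        return item

    def assign(self, title: str, member: str) -> None:
        """Assign an existing to-do item to a team member."""
        for item in self.items:
            if item.title == title:
                item.assignee = member
                return
        raise KeyError(f"No to-do item titled {title!r}")
```

There is almost nothing here that depends on distant parts of the system, so user feedback on a working increment is a reliable test of fitness for purpose.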

Now, let’s take a look at a different real case scenario. Consider a legacy system that supports a complex business process and that took years to develop, expand, and enhance. The system has multiple integrations with applications that either provide data for the system, or consume information it produces. The new system will have to not only satisfy unmet business and user needs (making the application easier to maintain, simplifying tasks, and providing more user-friendly interfaces), but also entirely replace existing functionality and integration points before it can go live.

As users describe and prioritize what they want (new calculated fields that will reduce the amount of work and data entry errors, user-friendly menus for navigation, etc.), the project starts on a positive note, with working software being delivered every 3 weeks, and feedback being used to make the system better.

Fast forward a few months. The system is not in production yet; due to the size of the legacy application being replaced, go-live is still a few months ahead. At this point, users performing acceptance tests start noticing certain things that aren’t quite right with features they had previously approved. For example, the user story “as a manager, I want to know which tasks have high priority so I can assign them to the next available resource” seemed fine, until a subsequently implemented feature allowed a user to discover that the calculation used to rank tasks by priority does not work in 30% of cases. “Not a problem”, the developers say, “scratch the priority calculation! Let’s go back to showing tasks in chronological order and letting the manager manually choose which tasks should be given higher priority, as the legacy system does.”
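As an illustration of how such a defect can hide, consider a hypothetical priority calculation (invented for this article, not the actual project’s formula) that divides urgency by the number of days remaining until the due date. Tested against the tasks visible at the time, all with future due dates, it looks fit for purpose; only when a later feature surfaces overdue tasks does the flaw appear:

```python
from datetime import date

def priority_score(urgency: int, due: date, today: date) -> float:
    """Rank tasks by urgency weighted by time remaining.

    Appears correct as long as every task under test has a future due date.
    """
    days_left = (due - today).days
    return urgency / days_left  # hidden defect: days_left can be 0 or negative

# In isolation the feature passes user acceptance: an urgent task due in
# 10 days scores 0.8, and less urgent tasks score lower.
score = priority_score(8, date(2011, 10, 20), date(2011, 10, 10))

# Once another feature makes overdue tasks visible, the same calculation
# raises ZeroDivisionError for tasks due today, and gives overdue urgent
# tasks negative scores, ranking them below everything else.
```

Each user who approved the feature made a rational judgment given what was visible at the time; the defect lives in an interaction with data the early increments never exposed.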

All would be good if it wasn’t for the fact that this is not the only case of a previously requested and accepted feature all of a sudden proving to be "unfit for purpose" after other parts of the system were made available for testing.

More and more problems start to surface, with user stories previously considered “done” having to go through rework not just once, but multiple times, due to new insight caused by added pieces of functionality interacting with each other. While in the beginning the entire team is unconcerned (“change is good, we embrace change!”), soon the volume of modifications required in the code is such that the project is running the risk of failure due to the impact on schedule and budget.

The most worrisome aspect of this scenario is that the changes requested by users reflected needs that may have been unforeseen, but were not unforeseeable. The changes had nothing to do with the “uncertainty inherent in the world”. Rather than being triggered by the need to respond to new market conditions, shifts in business strategies, or new insights about user behavior, such changes could easily have been avoided by proper elicitation and analysis of existing requirements. The need to modify previously implemented functionality caused waste and rework that could have been avoided if the requirements had been more clearly understood before the programmers started to code the features [1] .

To explain that change is inevitable, Agile practitioners like to quote Humphrey’s Requirements Uncertainty Principle, which states that for a new software system, the requirements will not be completely known until after the users have used it. The untold part of the story is that in many complex software projects, unless you put sufficient effort into requirements discovery and analysis, requirements may not be completely known even after users have tested and approved pieces of functionality delivered sequentially.

Why is that so? The main reason the combination of “just-in-time requirements” and early user feedback often fails to uncover missing or incorrect requirements is that systems thinking is not natural for most of us. Modern physics teaches us that the behavior of a system may not be a direct function of the behavior of its parts, but when users are asked to explain what they mean [2] , or are invited to test a feature, they focus on that particular aspect of the solution, without paying enough attention to its relationships with other parts of the system and to non-linearities that may produce unexpected results.

User input and feedback, especially in the early stages of a project, may be seriously affected by bounded rationality--the notion that individual decisions are rational within the bounds of the information available, but that such information is not perfect, especially about more distant parts of the system [3]. When first testing the task-prioritizing feature, based on the part of the system they could see and know, users made a reasonable decision that the functionality was fit for purpose. Only after their bounded rationality was enlarged (in the presence of better, more complete information provided by the addition of other features) did the conclusion change.

Agile proponents argue that because Agile methods support changing direction, the cost of being “wrong” is much smaller than in traditional, less flexible methods of software development. This conclusion is only true when you are capable of identifying early on that you are headed in the wrong direction, so the lessons can be incorporated in a timely manner. Unfortunately, that is often not the case in more complex projects.

Agile's philosophy that analysis can be done incrementally, with work broken down into small, achievable "chunks" of functionality, has proved to be wasteful in many of the agile projects I've joined. While implementing these chunks within short periods of time has been beneficial, allowing users to interact with the solution to verify whether it meets their needs, incremental analysis caused the stakeholders and solution teams to ignore the fact that a system is different from a heap or a collection. To truly understand a solution, one must understand the mutual interactions of the parts of the system, whose characteristics cannot be found in any of its individual parts.

In a previous article, I wrote:

"As with plan-driven organizations, mature agile organizations are always looking for ways to improve the performance of their software process. Optimization of the development process may require tailoring agile methods to the company’s needs and capabilities, and adding a talented BA to the team is becoming an obvious answer for many of the struggles that sometimes agile teams face, from competing interests among various stakeholders, to lack of a clear understanding of the problem to be solved with the software solution."

Skilled business analysts are trained to think systemically, analyzing requirements in relationship to other requirements, and seeing each piece of a system as a meaningful component of a working whole. While many agile teams prefer to document the solution after it has been delivered, or perform “just enough” requirements analysis for the highest-priority requirements taken from the project backlog, these strategies do not always yield the best results. While it’s true that customers and users do not always know what they want at the outset of a project, and that we must be open to change during project execution, that doesn’t mean just-in-time analysis is always the best approach to understanding the domain and identifying what needs to be built.

“Model just for now, you can always come back later” may be a good method to explore new ideas and generate innovative products. However, it is well-known that the later in the development cycle major errors are discovered, the more expensive it is to fix the defects. Being open to change should never be an excuse for lack of systems thinking, poor analysis, and inadequate and ever-evolving requirements. While adopting some of the Agile practices may help increase the value of software projects, the use of traditional business analysis techniques may turn out to be a huge step toward keeping a project on track and avoiding costly, high-risk requirements churn.

Read on: Agile vs. Traditional... - Part II: Methods are a means to the end of project success, not the end themselves

Author: Adriana Beal received her B.S. in electronic engineering and her MBA in strategic management of information systems from two of the most prestigious graduate schools in Brazil. For the past 10 years she has been offering consulting assistance throughout the software development life cycle for organizations implementing large, complex software projects. Adriana recently joined the e-commerce team at Hewlett-Packard. She is also the founder of Projeto 100%, a movement making significant changes in the lives of families trapped in poverty in Brazil.


[1] It’s interesting to see that even though the new system was replacing a legacy application (and consequently inheriting many requirements from the old system, instead of having completely new requirements), it was still hard for users to keep in mind the interactions that would happen between different parts of the system when all the pieces were put together. There was no significant difference between the requirements churn experienced in this project and in other agile projects I joined that weren’t replacing legacy functionality.

[2] "To discover the details behind the requirement, the developer (or developers on project teams which take a pair programming approach), ask their stakeholder(s) to explain what they mean." Source: http://www.agilemodeling.com/essays/agileRequirements.htm

[3] For more on bounded rationality, read Thinking in Systems: A Primer.


COMMENTS

ajmarkos posted on Monday, October 3, 2011 11:59 AM
Adriana:

Great article! Interrelationships, interrelationships, interrelationships - RA is all about discovering the interrelationships!!! Ohh, did I forget to mention interrelationships?

This point really really needs to be driven home to the BA community. Most analysts fear interrelationships, preferring to document a "heap or collection" of stand-alone requirements.

Some essential data flow diagrams (they are the ONLY technique that focuses on the ESSENTIAL functional relationships), some ERDs, and some screen shots. That's effective bigger-picture Agile requirements analysis! And bigger-picture analysis is what should be the main charge of BAs.

Tony




snabulsi posted on Tuesday, October 4, 2011 12:32 AM
Hi Adriana,

Excellent analysis and realistic view of how these methods are used and could be used more beneficially in the project world. While I can see the benefits that agile development can bring, I do feel strongly about the risk management advantages of a more traditional approach as well as better control over the "inadequate and ever-evolving requirements," you mentioned in your article.

Will be looking forward to Part 2 in this series!

Suehaila
adrianab posted on Wednesday, October 5, 2011 2:50 AM
Hi, Tony and Suehaila,

Thank you for taking the time to leave your comments!

Tony: Yes, too often the "bigger picture analysis" is missing in agile projects. I hope more BAs realize there's a great opportunity for them to start adding more value to their agile projects, filling that gap.

Suehaila: Part 2 has already been submitted to Modern Analyst, and I hope you will enjoy it too and leave another comment to share your views when it's published!

dzhou posted on Tuesday, October 11, 2011 7:37 PM
Hi Adriana,

Thank you. A very enjoyable piece of reading. I have some thoughts that I'd like to share...

1) Agile (e.g. SCRUM) focuses on time-boxed delivery, so that any change in requirements late in the development phase will be prioritised and the least important requirements will be dropped from the development scope. This limits the cost and schedule overrun. The pretext for this is that a) the end user representative (Product Owner) is given the authority to do so, and b) the nature of the system under development allows functionality to be dropped, from both a political and a commercial perspective.

2) Agile should be applied selectively depending on the complexity of the project. My experience is that on large-scale projects that require multiple external integrations, including interfaces to systems that are external to the organisation, and where the risks associated with these integrations are high, a systems engineering (traditional/formal) approach is needed, because I treat them as an Engineering Project rather than simply a software project. For systems that have few integration points and are relatively low in business criticality, I have found Agile a better/more efficient way to deliver.

3) Agile cannot succeed on large-scale projects without drastic organisation-wide change, from the top down, in building a project governance body aligned with the Agile (e.g. SCRUM) framework. However, in most established large organisations this is very difficult to achieve, because the impact is usually enormous and expensive and it requires strong leadership in most cases; so a lack of agile-catered governance is one of the reasons that causes an agile project to fail.


Thanks
Derek Zhou

adrianab posted on Tuesday, October 11, 2011 7:57 PM
Hi, Derek,

Thank you for leaving your comments. You are right to point out that in many cases the reason agile projects fail is the lack of organizational / governance support for adopting what can be a radically different way of developing software for a company used to a traditional model.

In the series that starts with this article, I decided not to focus too much on this aspect because lots of authors have already discussed the importance of organizational alignment for a successful switch from traditional to agile development. My intention is to focus more on other reasons--often ignored--why agile may fail (bounded rationality being one example, incompatibility of project profile with a "pure agile" approach being another, discussed in the second article in this series).

I look forward to continuing to exchange ideas as this series progresses!

Cheers,

Adriana
Coori posted on Thursday, October 20, 2011 12:52 AM
Hi Adriana,

Been meaning to respond to this thought-provoking piece for a week now, and finally getting to it.

While I agree with and am in deep sympathy with the notions of systems thinking and mutual interactions and dependencies, in my experience good upfront planning in the Agile context makes a big difference. For example, experienced (and I emphasize the word "experienced") project managers, developers and analysts who've worked together for a while will have a good sense/instinct/intuition about where some of those potentially lethal system interactions will occur; they will plan their Agile sprints accordingly, placing the riskier bits upfront to surface these sorts of things. In fact, they will often do sprints (2 or 3 of them) with no user-facing functionality that "end users" can interact with, because they're merely working in that early fact-finding mode, sussing out what sort of pitfalls might be looming.

I find that sort of Agile planning and prototyping invaluable to avoiding the sort of errors that you rightfully assert can occur.

Thanks,
Curtis
adrianab posted on Thursday, October 20, 2011 7:39 AM
Hi, Curtis,

Thank you for finding the time to write!

"I find that sort of Agile planning and prototyping invaluable to avoiding the sort of errors that you rightfully assert can occur."

I couldn't agree more. The project I used as an example had decent initial planning/modeling (provided by a seasoned agile team), but that alone didn't help avoid the problem of bounded rationality that compounded as the project progressed. I am convinced from experience with other projects that an early prototype would have helped avoid most of the issues the project faced.

However, the agile teams I have worked with in the past have always been against prototyping (and even spending much time understanding the big picture), insisting on starting right away to produce "working software" in iterative/incremental cycles. They work under the assumption that user feedback for the working pieces will be sufficient to highlight any problems early on, which doesn't necessarily happen.

In summary, I agree with all you are saying, but to me you are talking about a hybrid model, not pure agile. I've worked with a number of agile methods being implemented by agile specialists and never saw much emphasis on upfront planning/prototyping. While the Crystal family of methods talks about using different models "to meet the project’s unique characteristics", I have yet to see an agile approach effectively address the issue of systems thinking I describe in this article.

zarfman posted on Tuesday, October 25, 2011 10:39 PM

Greetings:

You wrote: Agile vs. Traditional Software Development: Why is the debate still going on?

Zarfman writes: Because no one has come up with a technique that is recognized as universally applicable to software development. Until that happens, if ever, you will continue to hear a chorus of "ours is the best, true way."

You wrote: The dangers of bounded rationality.

Zarfman writes: Bounded rationality is a fact of life; I don't see it going away any time soon.

You wrote: Skilled business analysts are trained to think systemically, analyzing requirements in relationship to other requirements, and seeing each piece of a system as a meaningful component of a working whole.

Zarfman writes: Systems thinking is alive and doing very well in certain areas. The most obvious examples are the aerospace industry, where control is required in fly-by-wire positioning of ailerons, or in the optimal choice of trajectories in space flight. The chemical industry requires good control designs to ensure safe and accurate production of specific products, and the paper industry requires accurate control to produce high-quality paper.

I contend that applying systems thinking to an organization/economic entity is in general wishful thinking.

Organizations are dynamic systems with some form of feedback. Unfortunately, their components are made up of groups of individuals, which are much harder to control (given politics, self-interest, and incompetence) than inanimate objects. Moreover, organizations are highly susceptible to the infamous three-part lag. First, one must discover the problem; this may take days, weeks or months. Second, one must devise a solution; this may take days, weeks or months. Third, did the solution work? This may take days, weeks or months. If the solution fails, go to step one, if the company is still a going concern.

Clearly, systems thinking relative to organizations has a long way to go.

You wrote: You received your B.S. in electronic engineering and MBA in strategic management of information systems from two of the most prestigious graduate schools in Brazil.

Zarfman writes: They must do things differently in Brazil. It sounds like you have one MBA from two different schools. Please clarify or not.

Regards,

Zarfman
adrianab posted on Wednesday, October 26, 2011 12:12 AM
@Zarfman

"Bounded rationality is a fact of life I don't see it going away any time soon."

Does it mean we shouldn't try to minimize its impact on our projects? I say we should and can do well with strategies like prototyping, so people can have a sense of the "forest" instead of just focusing on the "trees" as happens when you merely test the working software produced in a short development cycle.

"They must do things differently in Brazil. It sounds like you have one MBA from two different schools. Please clarify or not."

I have a B.S. in electronic engineering from one school and an MBA in strategic management of information systems from another school. If the description isn't clear, feel free to rewrite and send me your revised version. The current one was written by an American proofreader so I'm not inclined to change it on my own.

"Because no one has come up a technique that is recognized as universally applicable to software development. Until that happens if ever, you will continue to hear a chorus of ours is the best - true way."

In my opinion, that will never happen, because it's simply not applicable. To use a metaphor I don't particularly like (because construction projects have few similarities with software projects), you wouldn't use the same techniques to build a skyscraper and a small shack. In the second article in this series, I write, "Smart companies avoid one-size-fits-all solutions, bringing together practices and techniques from agile and non-agile approaches to create their own hybrid methodologies, adapted to fit each project profile."

Of course, you are free to continue to disagree with everything I write :-). Perhaps you should start writing your own articles, so you can have more space to detail your thoughts on the subject.
zarfman posted on Wednesday, October 26, 2011 6:00 PM
Greetings once again:

You wrote: Does it mean we shouldn't try to minimize its impact in our projects? I say we should and can do well with strategies like prototyping, so people can have a sense of the "forest" instead of just focusing on the "trees" as happens when you merely test the working software produced in a short development cycle.


Zarfman writes: Of course we can try. Bounded rationality is the idea that in decision making, rationality of individuals is limited by the information they have, the cognitive limitations of their minds, and the finite amount of time they have to make decisions. It was proposed by US Nobel-laureate Herbert Simon as an alternative basis for the mathematical modeling of decision making. I would also add that people are irrational and emotional with vested interests.

Even small organizations with 500-1000 employees are complex. They are often composed of management, accounting, finance, IT, engineering, R&D, manufacturing, supply chains, marketing, sales, etc. Even practitioners in the foregoing list are not protected from bounded rationality. This suggests to me that a BA is in an even more unfavorable position. As I've suggested many times before, how can one not trained in an art or science be expected to produce a competent analysis of that art or science?

Zarfman wrote: Because no one has come up a technique that is recognized as universally applicable to software development. Until that happens if ever, you will continue to hear a chorus of ours is the best - true way.

You wrote: In my opinion, that will never happen, because it's simply not applicable. ...

Zarfman writes: Actually, I agree with you; hence the noise will continue. You mention hybrid methodologies; sounds interesting, time will tell.

What I'm about to suggest will be viewed as a form of heresy. Dump the BA's into the IT department. Let those skilled in the art or science do the analysis. This would require a basic knowledge of interface design, programming and database structure on their part. Fortunately at the basic level these concepts are not difficult to understand.

Let me explain my bias. I started my career in the aerospace industry with the title engineer even though my undergrad majors were physics and mathematics. The only thing I knew about accounting and finance was reconciling my checkbook. When I acquired my MBA I was required to take management accounting courses. At this point I was in no way competent to solve finance and accounting problems. I took a year off and got a major in accounting. At this point I understood the finance and accounting jargon but lacked actual in-the-trenches experience. As I progressed through the ranks from accountant to CFO, VP, etc., I knew how to solve certain problems or whom to ask for outside advice (taxation) if I needed it.

My point is that until I had gained actual in-depth experience I did not consider myself competent to actually solve finance and accounting problems. To have thought so would have been delusional on my part. I hold others to the same standard.

Regards,

Zarfman
adrianab posted on Wednesday, October 26, 2011 8:43 PM
@Zarfman: We are definitely in agreement that basic knowledge of interface design, programming and database structure is important for IT business analysts (although not necessarily relevant for all types of business analysis activities, which may not involve the creation or change of information systems).

I come from a technical background; my first job, which I held for 2 years, was an engineering position in which I wrote code in C for a modem simulator. Later I learned Java, HTML, Javascript and PHP, as well as the fundamentals of database management. All this knowledge is extremely helpful for an IT business consultant like me. I see other people in the same role struggling to determine the best solutions for a business problem exactly because of their lack of technical knowledge.

"As I've suggested many times before how can one not trained in an art or science be expected to produce a competent analysis of that art or science."

Here I disagree with you if you are advocating that BAs should become experts in the business domain they serve. There are companies specialized in requirements definition that have achieved superior results (based on project metrics) producing the requirements for systems using BAs who were not trained in the specific business domain, but were experts at requirements elicitation techniques.

I also had this experience, of partnering with subject matter experts so they can provide the knowledge about the business problem while I was responsible for the analysis required to achieve the best solution for said problem. I can even say that I had better results when I was *not* an expert in the business domain being addressed, because then you can combine the knowledge of the SME with the analytical skills of an outsider to the field who is capable of looking at problem with fresh eyes and less bothered by the "dos and don’ts" that might prevent insiders from identifying a truly revolutionary solution.

zarfman posted on Thursday, October 27, 2011 10:09 PM

Greetings once again:

You wrote: We are definitely in agreement that basic knowledge of interface design, programming and database structure is important for IT business analysts (although not necessarily relevant for all types of business analysis activities, which may not involve the creation or change of information systems).

Zarfman writes: That's not my point, I contend the SME should possess the basic knowledge of interface design, programming and database structure. If the IT BA has those skills even better.

You wrote: I come from a technical background; my first job, which I held for 2 years, was an engineering position in which I wrote code in C for a modem simulator....

Zarfman writes: Did you also develop the model for the modem simulator, or did you simply program it? There is a huge difference between developing a model and programming it.

You wrote: ...All this knowledge is extremely helpful for an IT business consultant like me. I see other people in the same role struggling to determine the best solutions for a business problem exactly because of their lack of technical knowledge.

Zarfman writes: I see the same thing in business. BA's struggling to determine the best solutions for a business problem exactly because of their lack of business/technical knowledge.

You wrote: Here I disagree with you if you are advocating that BAs should become experts in the business domain they serve. There are companies specialized in requirements definition that have achieved superior results (based on project metrics) producing the requirements for systems using BAs who were not trained in the specific business domain, but were experts at requirements elicitation techniques.

Zarfman writes: I'm not advocating that BA's should become experts in the business domain they serve. I suspect that would be an exercise in futility. However, when using an outside resource, hopefully someone in the company has the knowledge to know if the outside resource has the skills to back up their sales pitch. By the way, I detest the word expert.

You wrote: I also had this experience, of partnering with subject matter experts so they can provide the knowledge about the business problem while I was responsible for the analysis required to achieve the best solution for said problem. I can even say that I had better results when I was *not* an expert in the business domain being addressed, because then you can combine the knowledge of the SME with the analytical skills of an outsider to the field who is capable of looking at problem with fresh eyes and less bothered by the "dos and don’ts" that might prevent insiders from identifying a truly revolutionary solution.

Zarfman writes: You partnered with subject matter experts so they can provide the knowledge about the business problem. What kind of analysis are you referring to after talking to the SME? How does one prove that a solution is the Best?

I agree that fresh eyes are sometimes helpful. However, in regulated industries dos and don'ts are very important. These dos and don'ts are state and federal laws, and violating them has resulted in multi-million dollar fines. I checked with a few (3) of my friends at a big oil company; they have no BAs. They do, however, have several floors of lawyers and quants.

You make our discussion interesting. Well, off to answer another e-mail from someone who thinks I've lost my mind.

Regards,

Zarfman
adrianab posted on Friday, October 28, 2011 6:32 PM
Zarfman wrote: "That's not my point, I contend the SME should possess the basic knowledge of interface design, programming and database structure. If the IT BA has those skills even better."

Then I disagree with your point, heh. I think it's too much to ask of SMEs to become knowledgeable in the areas you mention. I know from experience that it's not easy to acquire a reasonable level of expertise in these subjects, and I don't see why such expertise can't be provided by someone partnering with the SME to identify the best solution (the IT BA).

"Did you also develop the model for the modem simulator, or did you simply program it? There is a huge difference between developing a model and programming it."

No doubt, there's a huge difference -- that was in my first job after engineering school, so I was not the one developing the model in that case, although I later evolved to developing models in other areas (manufacturing process simulation and such).

"I see the same thing in business. BA's struggling to determine the best solutions for a business problem exactly because of their lack of business/technical knowledge."

In my opinion, BAs sometimes struggle to determine the best solution for the business problem due to their inability to leverage the business and technical knowledge available from the business and technical teams. I know plenty of excellent BAs who without possessing such knowledge were able to achieve high-quality solutions by taking advantage of the available specialists and applying their facilitation, analysis, and problem solving skills. In my experience, it's much easier to learn the business domain than it is to understand the technical domain at a sufficient level during the course of a project, and that's why I think IT BAs with a technical background tend to perform better in IT-related projects than BAs with no technical knowledge.

"You partnered with subject matter experts so they can provide the knowledge about the business problem. What kind of analysis are you referring to after talking to the SME?"

Typically I spend several days brainstorming the business need with a group of SMEs and representatives from the technical team, leading the team through a thorough assessment of needs and risks and the best ways to address them. Later I work alone to refine the ideas using a variety of analysis techniques (business rules analysis, functional decomposition, etc.), making sure all important aspects were covered before submitting the business requirements for stakeholder approval. Requirements for a viable solution are derived in a similar manner after the business requirements have been agreed upon.

Talented BAs are good at asking the right questions, so in-depth knowledge of the business domain isn't really necessary to reach a quality blueprint of what needs to be created, managed, operated, changed, and discontinued.

"How does one prove that a solution is the Best?"

I don't think any company should be spending time trying to prove that the agreed upon solution is "the best". Perhaps we could rephrase to say a "winning solution" rather than the best. It's easy to identify a winning solution by establishing and measuring against appropriate success criteria.

"I agree that fresh eyes are sometimes helpful. However, in regulated industries dos and don'ts are very important. These dos and don'ts are state and federal laws, and violating them has resulted in multi-million dollar fines. I checked with a few (3) of my friends at a big oil company; they have no BAs. They do, however, have several floors of lawyers and quants."

Oh, in the past I worked with heavily regulated industries, and in such cases the dos and don'ts are indeed more important than features and user experience. Here's what I believe though:

1) the company your friends work for does have BAs -- they may not hold that title but they are performing the activities of a business analyst.

2) Good BAs can be instrumental in defining excellent solutions for business problems without being specialists in the regulations they need to cover. They will make sure all the business rules that the solution must comply with are correctly identified and unambiguously documented, and they may very well offer novel ideas to achieve compliance goals, borrowing from other fields. Their work will beautifully complement the work of the lawyers and quants. What may be happening in this company is that they have "blended" roles in which the BA is also the SME, but they could be getting even better results if they actually hired analysts with solid skills in business process modeling, functional decomposition, requirements elicitation, etc.

"You make our discussion interesting. Well, off to answer another e-mail from someone who thinks I've lost my mind."

LOL! Tell that person that I don't think you are there yet. When you lose your mind I'll stop answering your comments here ;-).
zarfman posted on Saturday, October 29, 2011 9:35 PM


Greetings once again:

I'm guessing that we have a series of semantic and definition difficulties.

You wrote: ... I think it's too much to ask of SMEs to become knowledgeable in the areas you mention. I know from experience that it's not easy to acquire a reasonable level of expertise in these subjects, and I don't see why such expertise can't be provided by someone partnering with the SME to identify the best solution (the IT BA).

Zarfman writes: Let us consider a DBMS. If the SME understands that a table is something like (not the same as) an Excel spreadsheet, in that it can be created and updated, has rows and columns, etc., that's what I consider basic knowledge, nothing more. This should be of some help communicating with the IT BA. Note the use of the phrase "identify the best solution".
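[Editor's note: the spreadsheet analogy above can be made concrete in a few lines. The sketch below uses Python's built-in sqlite3 module against an in-memory database; the table and column names (customers, name, balance) are invented purely for illustration.]

```python
import sqlite3

# A relational table is, loosely, like a spreadsheet: it has named columns,
# one row per record, and it can be created and updated.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, balance REAL)")
conn.execute("INSERT INTO customers (name, balance) VALUES ('Acme Corp', 1200.50)")
conn.execute("UPDATE customers SET balance = 1500.00 WHERE name = 'Acme Corp'")

# Each tuple returned is a "row"; each position in it is a "column".
rows = conn.execute("SELECT name, balance FROM customers").fetchall()
print(rows)  # [('Acme Corp', 1500.0)]
```

That level of understanding -- rows, columns, create, update -- is arguably all the "basic knowledge" being asked of the SME here.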

You wrote: I don't think any company should be spending time trying to prove that the agreed upon solution is "the best". Perhaps we could rephrase to say a "winning solution" rather than the best. It's easy to identify a winning solution by establishing and measuring against appropriate success criteria.

Zarfman writes: New words, agreed upon and winning solution instead of best.

You wrote: It's easy to identify a winning solution by establishing and measuring against appropriate success criteria.

Zarfman writes: If it's easy to identify a winning solution, how do you explain the following? The $5 debit card fee proposed by B of A, an idea other banks are disassociating themselves from while customers are happy about it; Merrill Lynch, saved from bankruptcy by B of A, which bought them and then wanted to back out of the deal when it discovered undisclosed Merrill losses; Lehman Brothers Holdings Inc., which before declaring bankruptcy in 2008 was the fourth largest investment bank in the USA (behind Goldman Sachs, Morgan Stanley, and Merrill Lynch). The list of agreed-upon winning-solution disasters is almost endless.

You wrote: Here I disagree with you if you are advocating that BAs should become experts in the business domain they serve.

Zarfman writes: As I wrote before, I think advocating that BAs should become experts in the business domain they serve is an exercise in futility.

You wrote: 1) the company your friends work for does have BAs -- they may not hold that title but they are performing the activities of a business analyst.

Zarfman writes: Sounds like the term business analyst is not well defined, and can mean almost anything one wants it to mean, blended or not.

You wrote: 2) Good BAs can be instrumental in defining excellent solutions for business problems without being specialists in the regulations they need to cover. They will make sure all the business rules that the solution must comply with are correctly identified and unambiguously documented, and they may very well offer novel ideas to achieve compliance goals, borrowing from other fields. Their work will beautifully complement the work of the lawyers and quants. What may be happening in this company is that they have "blended" roles in which the BA is also the SME, but they could be getting even better results if they actually hired analysts with solid skills in business process modeling, functional decomposition, requirements elicitation, etc.

Zarfman writes: Have fun with this dynamic, ever-changing beast. Consider Statement of Financial Accounting Standards No. 157, Fair Value Measurements. Issued: September 2006. Effective date: for financial statements issued for fiscal years beginning after November 15, 2007, and interim periods within those fiscal years. Every year, including this year, the FASB has changed or broadened this accounting standard. In fact, accounting theorists are still struggling with the problem. I wish BAs (whatever they are) good luck solving this one.

Affects: Amends APB 21, paragraphs 13 and 18
Deletes APB 21, footnote 1
Amends APB 28, paragraph 30
Amends APB 29, paragraphs 18 and 20(a)
Deletes APB 29, paragraph 25 and footnote 5
Amends FAS 13, paragraph 5(c)
Amends FAS 15, paragraphs 13 and 28
Deletes FAS 15, footnotes 2, 5a, and 6
Amends FAS 19, paragraph 47(l)(i)
Amends FAS 35, paragraph 11 and footnote 5
Deletes FAS 35, footnote 4a
Amends FAS 60, paragraph 19
Deletes FAS 60, footnote 4a
Amends FAS 63, paragraphs 4, 8, and 38 through 40
Amends FAS 65, paragraphs 4, 6, 9, 10, 12, and 29
Amends FAS 67, paragraphs 8 and 28
Deletes FAS 67, footnote 6
Amends FAS 87, paragraphs 49 and 264 and footnote 12
Deletes FAS 87, footnote 11a
Amends FAS 106, paragraphs 65 and 518 and footnote 21
Deletes FAS 106, footnote 20a
Deletes FAS 107, paragraphs 5, 6, 11, and 18 through 29
Amends FAS 107, paragraphs 9, 10, 30, and 31
Amends FAS 115, paragraphs 3(a) and 137
Replaces FAS 115, footnote 2
Amends FAS 116, paragraphs 19, 20, 184, 186, and 208
Deletes FAS 116, footnote 8
Amends FAS 124, paragraphs 3(a) and 112
Replaces FAS 124, footnote 3
Deletes FAS 133, paragraph 16A and footnote 6c
Amends FAS 133, paragraphs 17 and 540
Effectively deletes FAS 133, footnotes 9b, 10b, 18a, 18b, 20a through 20e, and 24a
Amends FAS 136, Summary and paragraphs 15 and 36
Amends FAS 140, paragraphs 11(c), 17(h), 17(i), 63(b), and 364
Deletes FAS 140, paragraphs 68 through 70 and footnotes 20 and 21
Amends FAS 141, paragraph F1
Amends FAS 142, paragraphs 3, 19, 23, and F1
Deletes FAS 142, paragraphs 24, E1, and E2 and footnotes 12 and 16
Deletes FAS 143, paragraphs 6, 7, 9, A19, and F1 through F4 and footnotes 5 through 8, 17, and 19
Amends FAS 143, paragraphs 8, A20, A21, A26, C1, C3(d), C4, C6 through C9, C11, and C12 and footnotes 12 and 18
Deletes FAS 144, paragraphs 22, 24, A12, and E1 through E3 and footnotes 12 through 14, 28, and 29
Amends FAS 144, paragraphs 23, A6 through A8, A11, A13, and A14
Deletes FAS 146, paragraphs 5, A4, and A5 and footnotes 13 through 16
Amends FAS 146, paragraph A2
Amends FAS 150, paragraph D1
Deletes FAS 156, paragraph 3(c)
Amends FIN 45, paragraphs 9(a) and 9(b)
Affected by: Paragraph 2 amended by FSP FAS 157-1, paragraph 9(a)
Paragraph 36 amended by FSP FAS 157-2, paragraph 11
Paragraph D1 amended by FSP FAS 157-1, paragraph 9(b)
Footnote 2 amended by FAS 141(R), paragraph E4(a)
Other Interpretive Releases: FASB Staff Positions FAS 157-1 and FAS 157-2

It's party time!

Zarfman
adrianab posted on Sunday, October 30, 2011 8:54 AM
"Zarfman writes: Let us consider a DBMS. If the SME understands that a table is something like (not the same as) an Excel spreadsheet, in that it can be created and updated, has rows and columns, etc., that's what I consider basic knowledge, nothing more. This should be of some help communicating with the IT BA. Note the use of the phrase 'identify the best solution'."

Regardless of any differences in semantics you may have identified, it's still my opinion that there is no need for an SME to understand tables in order for a winning solution to be defined. In my experience, that may even be detrimental to the project (a case of when a little knowledge is dangerous). If you haven't worked in an environment in which an IT BA contributes this type of knowledge during the definition of a winning solution with the SMEs, then I can see it being difficult for you to understand my viewpoint.

"Zarfman writes: If it's easy to identify a winning solution, how do you explain the following?"

Your examples are completely out of scope for software projects, which is the topic of this series of articles, so I'll abstain from commenting :-).

"Sounds like the term business analyst is not well defined, and can mean almost anything one wants it mean blended or not."

You are correct, the term is definitely not well defined, and you will find huge variation in the type of work done by people holding this title. However, I am talking about "BAs" as any people who perform business analysis as part of their normal work activities. If you are interested in what "doing business analysis" means, I recommend reading the BABOK (http://www.iiba.org/imis15/IIBA/Professional_Development/Business_Analysis_Body_of_Knowledge/IIBA_Website/Professional_Development/Business_Analysis_Body_of_Knowledge_pages/Business_Analysis_Body_of_Knowledge.aspx?hkey=d0891e0a-996a-431f-a6f5-a7d644e23a5c).

As much as I appreciate your questions and comments, Zarfman, I am truly swamped with work both for my social project and full time job, so please forgive me if I don't go any further in this discussion. Let's save some debate for the next article in the series, shall we? (It's coming soon :-).

zarfman posted on Monday, October 31, 2011 12:45 AM
Greetings:

Works for me.

Regards,

Zarfman
Copyright 2006-2021 by Modern Analyst Media LLC