Monday, September 25, 2006

Missed the time-to-market window! Really?

We know that project management is all about juggling the three balls of time, cost, and quality. A project is successful if it meets the functional and non-functional requirements within predetermined time, cost, and quality constraints.

The traditional project management approach (and hence 99% of the tools) focuses on completing the defined work within given time constraints and cost limits. Recently, however, the focus has been shifting to the quality of the final output!!

Let's look at some examples:
  • Google. Didn't Google miss its time-to-market window long before it released its search engine?
  • Apple iPod. Would it have made a difference if the iPod had been delayed another six months?
  • Toyota Prius in 2000. It missed the TTM by three years! (Audi released its first hybrid in 1997, and Honda released its hybrid in 1999.)
More and more companies are realising that quality rocks!! If your product is of high quality, it doesn't really matter if you are a year or two late to the market. Every product has a life, but a high-quality product tends to live longer - which changes the whole Net Present Value (NPV) calculation, in case you are using NPV to assess the viability of your projects and releases.

Project requirements can be divided into functional and non-functional buckets. Functional requirements are the core (and supplementary) features of your product. Non-functional requirements are the systemic qualities, which encapsulate all the "ilities" - Availability, Scalability, Reliability, Flexibility, Extensibility, Interoperability, Compatibility, Testability, Understandability, Load and Performance, Stability, Resiliency, Manageability, Maintainability, Security, Supportability, Adaptability, Configurability, and Usability! Note: not all ilities are applicable to all product offerings.

Your product may have over a thousand features, but just pick a handful of core ones (maybe 3 to 5) and all of the non-functional requirements for your first release. A high-quality product markets itself: word-of-mouth is the most effective marketing tool. Once a customer is hooked, slowly roll out new features. That way you'll keep the relationship going and you can get a continuous inflow of money - easy from the SEC's perspective, and no hassle of accounting manipulations either! That's what is driving the software-as-a-service (SaaS) market today.

SOA is the SaaS enabler, and it is changing the way software is released. SOA brings business agility. However, our project management tools are still old-fashioned. Project managers are still focused on TTM and CTM concepts. They are still chasing deadlines and pennies. Quality awareness is forcing ALM companies to come up with more sophisticated tools that stitch the SOA fabric.

For innovation and quality, you are never late to the market!

Wednesday, September 20, 2006

Follow on: Revisiting the Definition of Software Quality

There is an interesting article and discussion on the definition of quality on StickyMinds.com! The article dates back to 2001, but it is still very much relevant. Robert L. Glass has done a good job of defining what quality is and what it is not. As you read through the comments, not everyone agrees with his definition - as one would expect. Quality is a FAT word and can be interpreted in a zillion ways. It is therefore important that the project team agrees on one definition of quality and sticks to it. Consistency is far more important than the definition itself.

ISO definition of Quality: The totality of features and characteristics of a product or service that bear on its ability to satisfy stated or implied needs. (ISO 8402: 1986, 3.1)

This definition captures both functional and non-functional requirements. And by the way, the official name for all the "ilities" is Systemic Qualities. And there are a lot more systemic qualities than the ones Robert mentions - for instance, interoperability, availability, scalability, etc.

Another point on which I disagree with Robert is his claim that "customers/users must participate" in prioritizing and selecting the "ilities". Some of these systemic qualities are customer-facing and others are company-facing. For example, it is in the company's best interest to make sure there is flexibility in the code for future expansion, and understandability is important to the company for maintenance purposes! The customer doesn't care whether your code is modular or your architecture is flexible. All he cares about is the feature set he wants, when he wants it. The customer cares even less about the business requirements. But when we talk about quality, all requirements come into the picture:
  • customer requirements
  • business requirements (which capture market and corporate requirements)
  • legal requirements
  • government requirements
  • social requirements
  • testability requirements
  • operations requirements
  • engineering requirements
I don't think we need to be in agreement with the customer on all of these requirements!!

Another interesting topic raised in the article is whether quality can be quantified, given Robert Glass's definition. I find it rather amusing because, I think, quality can be defined and can even be quantified. Of course, not everyone will agree with your definition and your way of quantifying it, but you can definitely do it. And as I said, consistency is far more important than the definition itself.

Read Quality Index (QI): Measure of Risk for more insight into how you can measure software quality.
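To make the point concrete, here is a minimal sketch of one way to quantify quality as a weighted Quality Index. The metric names, scores, and weights below are hypothetical and not taken from the QI article referenced above - the point is the mechanism, not the numbers.

    import java.util.LinkedHashMap;
    import java.util.Map;

    /** A minimal sketch: Quality Index as a weighted average of normalized metric scores. */
    public class QualityIndex {

        // Each metric maps to {score in [0,1], relative weight}.
        private final Map<String, double[]> metrics = new LinkedHashMap<String, double[]>();

        public void addMetric(String name, double score, double weight) {
            metrics.put(name, new double[] { score, weight });
        }

        // Weighted average of all metric scores, expressed as a percentage.
        public double compute() {
            double weightedSum = 0.0, totalWeight = 0.0;
            for (double[] m : metrics.values()) {
                weightedSum += m[0] * m[1];
                totalWeight += m[1];
            }
            return totalWeight == 0.0 ? 0.0 : 100.0 * weightedSum / totalWeight;
        }

        public static void main(String[] args) {
            QualityIndex qi = new QualityIndex();
            // Hypothetical metrics and weights - pick your own, but keep them consistent.
            qi.addMetric("requirements covered by tests", 0.90, 3.0);
            qi.addMetric("test pass rate", 0.95, 3.0);
            qi.addMetric("open P1 bugs (inverted)", 0.70, 2.0);
            qi.addMetric("code coverage", 0.65, 1.0);
            System.out.printf("Product QI: %.1f%%%n", qi.compute()); // Product QI: 84.4%
        }
    }

Whatever model you choose, apply the same metrics and weights release after release - consistency is what makes the resulting QI trend meaningful.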

Friday, September 15, 2006

It all boils down to Metrics!!

Setting up a goal is one thing, but how do we know that we have achieved it? Software engineering is becoming more of an art than a science. Success is a relative term! A project manager with exceptional artistic and articulation skills can sell a project that is on the road to failure to the executives as a successful investment. In the absence of real numbers, darkness prevails. And under this darkness, all decisions lead to the path of failure.

Snapshot:
"We have a GA date approaching. PPM calls a Projet-Team meeting and takes a vote of confidence, which decides the fate of the software!!"

"A P1 bug is not a show-stopper if it already exists in production. The release will not degrade the production quality!!"

"QA gives a conditional GO with list of risks. By the time decision propogates to executives, the attachment is dropped and the Conditional-Go turns into a Sure-shot-Go!!"

"PM: The problem is not in our piece of the code. The issue is because of the other component that we are dependent on!!"

Sound familiar? Interesting, isn't it?

Let's face it: we need sophisticated tools that can generate real-time metrics so that anyone can make informed decisions. People often mix up product quality with process quality. Even though a high-quality process generates a high-quality product (the TQM principle), I believe the metrics for the two should be different. For example, a higher percentage of test automation improves QA process quality but doesn't directly improve the underlying product quality! Note: automation of processes early in the ALM cycle has a more direct impact on product quality.

Here is the list of questions that metrics should be able to answer:
  • Q1. What is the overall Quality Index (QI) of the product? What is the QI for a particular feature or requirement? What is the QI of the different components?
    • Consistency of the processes and measures is the key here.
    • It is easy to fabricate a QI model that concentrates on intrinsic product quality!
  • Q2. What's our Release Readiness? What's the risk if we release our product today?
  • Q3. What's the QI trend across different releases and different builds?
    • The trend is more important than the actual QI snapshot.
    • Errors in the QI (if any) cancel out when you read trends.
  • Q4. What's the Dependency Matrix (DM)? How do other SOA components impact my product? How does my product impact other offerings in the organization?
    • Current snapshot from QI perspective
    • Roadmap overlap for future releases. Cross-project Backlog Management.
  • Q5. What do the real-time coverage graphs look like?
    • Test Coverage (tests validating requirements)
    • Code Coverage (tests validating code)
  • Q6. Testing Strategy Automation.
    • When files A, B, and C change, which features get impacted? What test cases and configurations should a test lead plan for the next build? (A minimal sketch of such a lookup follows this list.)
  • Q7. Process Quality. How productive is my team?
    • Measures of test automation.
    • Comparisons with baseline (and manual testing)
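Picking up Q6 from the list above, here is a minimal sketch of what such a strategy-automation lookup could look like: a reverse map from changed files to features, and from features to test cases. The file, feature, and test-case names are hypothetical.

    import java.util.Collection;
    import java.util.List;
    import java.util.Map;
    import java.util.Set;
    import java.util.TreeSet;

    /** A minimal sketch: map changed files to impacted features, and features to test cases. */
    public class TestImpactAnalysis {

        // Hypothetical mappings; in practice these would be mined from the SCM and ALM tools.
        private static final Map<String, Set<String>> FILE_TO_FEATURES = Map.of(
                "LoginAction.java", Set.of("login"),
                "SessionUtil.java", Set.of("login", "checkout"),
                "CartService.java", Set.of("checkout"));

        private static final Map<String, Set<String>> FEATURE_TO_TESTS = Map.of(
                "login", Set.of("TC-101", "TC-102"),
                "checkout", Set.of("TC-201", "TC-202", "TC-203"));

        // Which test cases should the test lead plan, given the files changed in this build?
        public static Set<String> testsFor(Collection<String> changedFiles) {
            Set<String> planned = new TreeSet<String>();
            for (String file : changedFiles) {
                for (String feature : FILE_TO_FEATURES.getOrDefault(file, Set.of())) {
                    planned.addAll(FEATURE_TO_TESTS.getOrDefault(feature, Set.of()));
                }
            }
            return planned;
        }

        public static void main(String[] args) {
            // Two files changed in this build; regress only the impacted tests.
            System.out.println(testsFor(List.of("LoginAction.java", "SessionUtil.java")));
            // prints [TC-101, TC-102, TC-201, TC-202, TC-203]
        }
    }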
With the above metrics in hand, I can easily make statements like:
  • We are ready for the release!! Our product QI crossed 85% in the last build.
  • Because of TTM (time to market) pressures, we have decided to release our product with 65% QI. To mitigate the risk, we have also decided to increase our customer support resources.
  • Our product is not ready for GA because we have a dependency on products B, C and D, and product B has a QI of only 30%. Since B is tightly coupled with our core, we are not in a position to release our product.
  • I can effectively utilize my QA resources to concentrate on only the impacted features in a build. We don't have to regress every build every time. We can validate a build with a handful of fixes in less than two hours, and that too with over 95% confidence!!
  • We can now sell SLAs and QLAs around certain metrics because we have a consistent (and automated) way of capturing them.
  • I can trace a customer escalation all the way back to requirements, because we have end-to-end integration of ALM tools with excellent search facilities.

Friday, September 08, 2006

Role of a Quality Architect

I found a really interesting article on Application Quality by Allen Stoker. Make sure you read both part 1 and part 2. The best part is that it discusses the need for a Quality Architect:
"Quality begins in the team - not the application. Proper planning, communication and processes are essential to any successful project. Projects that lack these fundamentals will likely produce problematic applications. I'm a firm believer that large teams with diverse skill sets need a Quality Architect - a highly skilled technical person on your team who has no assignment but to support or ‘enable’ the other team resources. Such a resource can mean the difference between project success and failure."
This is even more interesting to me because I spent some time last year just to understand the role. I would agree 101% with Allen that this role can make a world of difference and can be responsible for a project's success or failure, especially in light of the fact that quality is the measure of success! (assuming quality is part of the defined business goals)

The Role of a Quality Architect:
  • Get the business, engineering, and QA teams to agree on common quality goals (i.e. define quality!)
  • Establish QA infrastructure to boost the team's efficiency and effectiveness
  • Establish processes that complement the tools and provide end-to-end traceability
  • Review product architectures and provide feedback on systemic qualities before the development cycle starts
  • Understand the ALM process and identify risk elements from a quality perspective
  • Standardize processes and procedures to be able to develop SLAs and QLAs
  • Work with cross-functional teams to combine elements of project management and business analytics, especially w.r.t. SOA interdependencies
  • Translate quality metric data into information! Enable intuitive reporting to drive transparency into the product's intrinsic quality.
  • Participate in, review, and approve testing strategies
The role touches almost every aspect of ALM, i.e., from requirements all the way back to requirements. Horizontally, the Quality Architect is responsible for coordination and collaboration across cross-functional teams (from marketing to design & development to QA to operations & customer care). Vertically, the person is responsible for boosting the team's productivity while, at the same time, explaining quality metric data to executives in layman's terms.

The role requires a fine balance between extraordinary people skills and hands-on technical skills!

Trackback URL: Role of a Quality Architect

Thursday, September 07, 2006

Follow on: Development vs. QA - Why disagree?

I like the bold questions raised by Jason in his blog "Development vs. QA - Why disagree?". Over the last half decade (especially after the dot-com bust), quality has gained overwhelming visibility in the software industry, and the awareness is growing day by day. Numerous studies have demonstrated the exponential relationship between the life of a bug and its associated cost. Sustaining costs are increasing far beyond original development costs. Therefore, companies are trying to crush new bugs as soon as they find their way into the code. And hence, the pressure is on developers to test their own code!

I don't see the testing goals of development and QA as conflicting. I see the conflict arising more from differences in role, availability of test beds, and, more importantly, the will! Developers generally don't want to do testing - they always write the perfect code!

QA, meanwhile, is way over on the other side of the wall. 99% of QA teams are involved in black-box testing of features as the customer sees them. So the quality organization is always closer to the end customer than development is.

To me, both practices are inefficient. We all know that QA cannot build quality in by testing at the end; it's the development team that needs to write quality code to start with. I am a firm believer in TQM principles and Deming's 14 points. To improve quality, all processes must be standardized, engineering principles must be put in place, and there should be tools that ease the adoption of all these processes. Processes without tools create too much work and chaos!

The solution is to have a QE team - a team that is closer to development (reporting to the same director, or even the same manager!) - responsible for all the functional testing. The QE team can catch bugs early in the development cycle, and the QA team can focus only on non-functional requirements as part of system testing.

Developers don't want to be QA!! They restrict themselves to unit testing and some basic functional unit testing. Another complicated issue is deployment, which is generally not automated. The lack of automated deployment breaks the Continuous Integration cycle and developers' motivation to automate post-deployment test scenarios. Using tools like MockStrutsTestCase and CactusStrutsTestCase, developers have started to look into some level of pre-deployment functional unit testing (including in-container testing) - but again, that's stretching the limits of unit testing, as Jason said.
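For illustration, here is roughly what such a pre-deployment functional unit test looks like with MockStrutsTestCase; the action path, parameters, and forward name below are hypothetical:

    import servletunit.struts.MockStrutsTestCase;

    // A sketch of a pre-deployment functional unit test for a Struts login action.
    // The action mapping, parameters, and forward name are hypothetical.
    public class LoginActionTest extends MockStrutsTestCase {

        public void testSuccessfulLogin() {
            setRequestPathInfo("/login");                 // hypothetical action mapping
            addRequestParameter("username", "testuser");
            addRequestParameter("password", "secret");
            actionPerform();                              // runs the action outside the container
            verifyForward("success");                     // assert which forward the action chose
            verifyNoActionErrors();
        }
    }

The appeal of the StrutsTestCase pair is that roughly the same test can also be run in-container by extending CactusStrutsTestCase instead - which is exactly the stretching of unit testing described above.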

QA is black-box. But there is a limit to what QA can test with limited resources, especially time. Once feature freeze is done and all the code is checked in, nobody wants to give QA a couple of months just to complete the test cycle. Given the complicated nature of software, with all the reusable code and interdependencies, fixing one bug late in the game has a huge potential to give birth to two or more bugs (it's like the Sammael monster from Hellboy!).

Watch Watts S. Humphrey's video for more info on why testing at the end is a bad, bad thing to do.

Trackback URL: Follow on: Development vs. QA - Why disagree?

Security Testing (Application Layer)

There are different layers at which we can test for security - Physical, Hardware, OS, Network, and Application. In this blog, I am only addressing application-layer security testing. Therefore, you'll not find items like testing of firewall policy rules, hardened OSes, checking for open ports on every system in the data center, testing of dialup & VPN access to systems, system interconnection vulnerabilities, or Intrusion Detection Systems (IDS). This blog is just a starting point and does not guarantee an end-to-end security test plan.
  • Authentication: The act of identifying an individual, i.e., determining whether they are who they claim to be. This testing includes:
    • Password-based authentication
    • Checking against the Denied Parties Restriction List (DPRL)
    • Test for unauthorized countries using Reverse DNS (rDNS)
    • Test for login leakage: on authentication failure, make sure the user is not told whether it was the user ID or the password that was incorrect.
  • Authorization: The act of determining whether a given user is allowed to access a given resource under given circumstances (Role-Based Access Privilege).
    • Test that only authorized administrators with the appropriate privilege are allowed to access each administrative function.
    • Spoof testing: log in with one role and try to access administrative functions not granted to that role (use URL bookmarking).
    • Test by accessing restricted URLs without logging in (a sketch of this check follows the list).
  • Password Strength. Test for password length and strength, password history, rollover and expiry. Make sure dictionary words are not allowed.
  • Passwords in clear text.
    • Check for passwords hard-coded into the software bits or scripts. Run strings on the binaries and look for password tags and strings.
    • Check for passwords in log files (at all log levels).
    • Check for passwords in client-side cookies and hidden form fields.
  • Encryption. Test to make sure that all form submissions use encryption so that information such as passwords does not transit the network in clear text.
    • Use snoop to capture network packets and make sure no data is transmitted in clear text
    • Check for SSL Certificates - HTTPS and TLS (for LDAP)
  • Session Management. The act of maintaining a transaction, or a set of transactions, from a given user. This involves maintaining the context (some sort of GUID) of the original authentication so that the user does not have to provide a password for every submission.
    • Test for the automatic password-protected locking feature on timeout.
    • Logout action must terminate the active session
    • If multiple servers are used, make sure session transfers are secure and work as designed.
    • Make sure when a session is destroyed, it is destroyed across all systems.
    • Test for maximum session limit per user (if there is any limit imposed).
  • User Profile and Privacy. Make sure that the company's privacy policy is communicated to the end users. Any form that collects personal information must include a privacy purpose statement explaining why the information is being collected and how it will be used.
  • Cookies: Cookies are stored in the browser cache, generally to manage session state. They can be permanent or session-specific, the difference being that session cookies get destroyed when the browser is closed. Since these are plain files, they can be edited by any hacker.
    • Test permanent cookies to make sure no user-specific information (ID, username, or password!!) is saved.
  • Auditing and Logging: Act of checking a set of actions to ensure that they comply with a given set of expectations.
    • Check for information-protection regulations, such as Sarbanes-Oxley, Gramm-Leach-Bliley, the Data Protection Act, or HIPAA.
    • Test to make sure security relevant events are getting logged. Events that are logged must include sufficient information, including: Date/Time; System/Subsystem identifier; User/Process ID (if relevant).
    • Logging events include:
      • Number of password-guessing attempts
      • Attempts to use privileges that have not been authorized
      • Denial-of-Service attacks
      • Login logs. Test to make sure the information logged includes the user name, the date and time of login, and any privilege escalations that are requested and are granted or denied.
      • Last login. Test to make sure that at login time, every user is given information reflecting their last login time and date.
  • Web Security Threats
    • HTTP GET vs. POST. Make sure the portal submits form data using HTTP POST. If HTTP GET is used, the form data is visible in the URL, irrespective of whether HTTP or HTTPS is used.
    • Check for passwords and other customer-sensitive data in hidden form fields.
    • Test to make sure that the web server is not configured to show directory listings.
    • XSS security threats. Refer to http://sec.drorshalev.com/dev/xss/xssTricks.htm for more details
    • URL redirections. Test to make sure all form submissions go through HTTPS.
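As promised in the authorization section above, here is a minimal sketch of automating the restricted-URL check. The URLs are hypothetical, and the pass criterion is deliberately simple: an unauthenticated request must come back as a 401, a 403, or a redirect (presumably to the login page); anything else is flagged.

    import java.net.HttpURLConnection;
    import java.net.URL;

    /** A minimal sketch: verify restricted pages are not served without authentication. */
    public class RestrictedUrlTest {

        // Hypothetical restricted URLs for the application under test.
        private static final String[] RESTRICTED = {
                "https://app.example.com/admin/users",
                "https://app.example.com/account/profile"
        };

        public static void main(String[] args) throws Exception {
            for (String address : RESTRICTED) {
                HttpURLConnection conn = (HttpURLConnection) new URL(address).openConnection();
                conn.setInstanceFollowRedirects(false); // a redirect to the login page is acceptable
                int code = conn.getResponseCode();
                boolean blocked = code == 401 || code == 403
                        || (code >= 300 && code < 400); // redirected, presumably to login
                System.out.println(address + " -> " + code + (blocked ? " OK" : " *** LEAKED ***"));
                conn.disconnect();
            }
        }
    }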
Trackback URL: Security Testing (Application Layer)