My client was based in the beautiful town of Heidelberg, Germany. Once again I was lucky to meet wonderful, talented people and make some great friends.
The client was a consumer price comparison website for household gas, electricity and telecommunications. It took off as a startup 10 years ago and has since grown into a medium-sized, family-run business and one of Germany's most popular price comparison websites.
As part of improving their software delivery, they hired us to help them pilot an agile process. We started off with two simultaneous mini-projects aimed at increasing the number of people signing up for products in their system and enhancing the user experience in general.
Apart from coaching them on new technology and better ways to write code, we also had to share our knowledge of process improvement.
There were mixed emotions in the team at the beginning. Some members were very eager to learn and pick up new things, some approached it cautiously, and some were reluctant. This is common at most clients and a very natural human response to change.
Over time, after various hurdles and plenty of lessons learnt, they eventually started to appreciate the value of these practices. We also ran a session called "Why we do What we do" to help them reflect on all the practices we had introduced.
One thing I learnt was that when consultants hold different opinions on a situation, rather than confusing and overwhelming the client with all the ideas, they should first reach a common understanding among themselves, put that recommendation before the client, and present the other ideas merely as suggestions.
The last thing you would ever want is visible internal conflict within the consulting team.
This was a client with no QAs on their software development teams and, not surprisingly, little knowledge of the QA process itself. I introduced the concept of test automation, and the developers initially suggested using Ruby with Cucumber to write the tests in a BDD style. After a while, though, they started facing difficulties learning a new language, and it was also not a good practice to have the code base in .NET and the tests written in Ruby. So I suggested they use WatiN along with YatFram. It was good fun pairing with the developers to help them write the tests.
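For flavour, a browser journey test with WatiN looks roughly like the sketch below. This is a minimal illustration, not the client's actual tests: the URL, element IDs and page text are invented, and YatFram's wrapper layer is left out.

    using NUnit.Framework;
    using WatiN.Core;

    [TestFixture, RequiresSTA]  // WatiN drives COM-based IE, which needs a single-threaded apartment
    public class SignUpJourneyTest
    {
        [Test]
        public void Visitor_can_reach_the_tariff_results()
        {
            // Launch a real Internet Explorer instance against the site
            using (var browser = new IE("http://www.example.com"))        // placeholder URL
            {
                browser.TextField(Find.ById("postcode")).TypeText("69117"); // hypothetical field id
                browser.Button(Find.ById("compare")).Click();               // hypothetical button id
                Assert.IsTrue(browser.ContainsText("Tarif"));               // hypothetical result text
            }
        }
    }

The appeal for this team was that the tests live in the same solution and language as the production code, so the developers could write and debug them with the tools they already knew.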
One of the challenges in writing automated tests was that the legacy code base was not in a very testable state. To write unit tests, the developers first had to refactor the code, but at the same time they had to draw a line on how much to refactor. So we had to rely on high-level, browser-based user journey tests. The downside is that these take longer to run on the build. To reduce the build time I recommended splitting the tests into two builds, one running the quick journeys and a second running the detailed tests, as sketched below. This gave the developers a better opportunity to check in frequently, while compromising a little on the speed of feedback.
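One simple way to express that split, assuming NUnit as the test framework (the category names here are just illustrative), is to tag the tests and let each build run only its slice:

    using NUnit.Framework;

    [TestFixture]
    public class JourneyTests
    {
        // Quick smoke journeys: run on every check-in by the first build
        [Test, Category("QuickJourney")]
        public void Homepage_shows_the_comparison_form()
        {
            // ... drive the browser through the shortest happy path ...
        }

        // Slower, exhaustive journeys: run by the second build
        [Test, Category("DetailedJourney")]
        public void Complete_sign_up_journey_ends_with_confirmation()
        {
            // ... step through the full sign-up flow ...
        }
    }

Each build then tells the runner which slice to execute, for example nunit-console /include:QuickJourney Tests.dll for the fast build and /include:DetailedJourney for the thorough one.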
As time progressed and the tests started doing their job of catching defects whenever someone changed the code, the developers began to appreciate their value and became more proactive about writing them.
I had initially suggested that a round of performance testing be done before the first deployment to production, given the changes we had made to the code base. It was de-prioritised, until after go-live they realised their servers were running out of memory. We had to quickly diagnose and fix the problem, after which we ran a series of performance tests on the staging environment. I introduced JMeter for this purpose, which worked out well, and ANTS was used to monitor server performance.
As a non-German speaker, I did find it a bit difficult to actually test the website, though I managed eventually.
The website was very heavily dependent on the database. Interestingly, they had 28 web servers and just one database server. Moreover, all the validations and static information were kept in the database in the form of stored procedures.
We tried setting up different environments as part of the build process. Though we did not achieve great success in automating the deployment process by the end of the project, we managed to improve it over time, with the developers fixing the gaps and defects I kept coming across as the process grew. It was fairly complex, as it involved deploying the code base, database, CSS, third-party tools and a CMS backend, all separately.
During my time on the project, I got an opportunity to hire a person internally for a QA role and mentored him over several weeks. The team was very happy that he picked up the role well enough to carry on independently once we left.
We deployed successfully three times over the three-month period and delivered as expected. We also gave the client feedback on what they could improve in the future. Overall it was a good project, a great client, and an amazing country and people.