
June 10, 2009

Dojo TDR @ XP Day France 2009

This is some belated feedback on my session:

+ good attendance, room was full but not overcrowded, great :)

? this was my fifth experiment with this type of dojo; a colleague and I ran the first one last November, and I still had high expectations for the output

+ the subject (Blackjack) was an excellent base to work on, thanks to Gojko whom I stole the idea from

+ thanks to the XP Day goodies, I got a much better timer than I used to have for timekeeping

- I proposed the FitNesse wiki to record the tests; not a very efficient tool for authoring, but that's all I had on my laptop

+ the audience was very dynamic in trying to help whoever held the keyboard

? I started the 5-minute turns by intentionally choosing a test that was not at the heart of the problem, just to check whether the participants would come back to the core topic; they did not, so I'm not sure that was a good idea

- the participants were quite late in identifying interesting tests; it was actually halfway through the session when I asked them whether the tests roughly illustrated the Blackjack rules

+ towards the end, one participant got rid of FitNesse and grabbed a pencil to draw some tests on the flip chart, which helped a lot in sharing an understanding of the Blackjack rules

+ the conclusion I offered the audience was something like: "if you want to learn about the domain and the requirements, don't pay too much attention to the tools, they're distracting you from the objective"

+ finally, I received very good feedback about the session from the participants. If you attended the workshop and have more feedback, please let me know

October 4, 2008

What have you tested today?


I will be presenting a lightning talk at the Valtech Days 2008 in Paris on the 21st of October.
It will be a "refactored" version of the lightning talk I gave at the AA-FTT workshop on the 4th of August, just before the Agile 2008 conference. There is a low-fi recording of this talk on Google Video, thanks to Elisabeth Hendrickson:



One of the attendees also wrote a blog post referring to the talk.

Since then, I have kept thinking about how to present these values and the rationale behind them. One of the consultants at Valtech also found a pretty good name for this talk: "What have you tested today?". I think this question nicely wraps up the 3 values that I emphasize. That is the name of my lightning talk! Come on the 21st of October to learn more.

August 9, 2008

The material of the TDR workshop at Agile 2008

I'm posting here the (electronic) material that I created for my workshop at Agile 2008, Test-Driven Requirements: Beyond Tools. Feel free to reuse it to play the game yourself; I only ask that you keep me informed of the outcome.

Game 1:
I created this game after an experiment I took part in at the University of Linköping (Sweden) back in 1999. I found the concept interesting enough to draw a parallel with software engineering. I named the 2 main players "Product Owner" and "Developer", and I added a couple of roles, the User and the Tester, to add a bit more spice to the game and bring it closer to a software product development process.
To play the game, you need 2 identical sets of wooden blocks, a table, and a splitter.


Here are the instructions of the game:
- Audience instructions
- Players instructions

Game 2:
I completely made this one up (or so I think, until someone tells me it's an old game they played at Agile 1975 ...). In this game, groups of 3 to 9 people try to uncover the requirements behind a vague specification that a user (one of the group's members) has read just before. The objective is not so much to uncover real requirements or needs, but rather to use tests to explore the needs and formalise a specification. The users don't know anything concrete, so the specification has to be elaborated through group collaboration.


The game starts with a short presentation reminding the audience of tools and techniques for using tests as specifications. Then, one user in each group is given a Memo to read to get to know the context of the system the group will work on. I let the groups work for 20 minutes and then ask them to write the result of their collaboration on a sheet of white paper. I will post the output of the workshop soon (I have to upload a report on the Agile2008 submissions website).

July 31, 2008

Agile 2008: Presentation of Examples stage

Here is a podcast in which Adam Geras presents the Examples stage of Agile 2008. He talks about my session towards the end of his speech. Thanks Adam.

You will also find podcasted presentations of the other stages here.

July 29, 2008

Preparing Agile 2008: TDR workshop

Yesterday evening my colleagues and I organized a rehearsal of my workshop for Agile 2008 at the Valtech office in Paris. The event was open and announced on the XP France web site. My colleagues took this opportunity to videotape the session, and we invited a production company to do the same, so hopefully we should post a video abstract on the Valtech website in a couple of days. I would like to thank all the participants who took the time to come after standard office hours. They were greatly rewarded with pizzas :)
Eric has already posted a summary of the evening (in French).

It was the first time I conducted the second part of the workshop, as I already conduct the first part in the TDR courses. This session uncovered lots of interesting points.

About the format and organization first. It lasted approximately 135 minutes despite my efforts to shorten some phases, so it is still 45 minutes too long. I can optimize time by writing guiding sheets for the participants, so I don't need to spend as much time explaining what they're expected to do; that will also make the instructions clearer. I will also reduce the number of wooden blocks so the first part of the workshop can fit within the expected 30 minutes.

Regarding the content, I was surprised how unnatural it seems to use tests for exploring needs and formalizing requirements. The participants did use examples in their discussions, but it seems they didn't realize that these examples were already test embryos, so they took note of generic software requirements derived from the examples, and then wrote tests from those requirements... exactly the opposite of what I wanted to see. So I encouraged the groups to work on concrete examples and to start writing tests from those examples.


Going from examples to concrete tests then seemed to be another difficult step. I observed that many of the participants proposed to write tables (FIT habits?) where I felt that writing text and progressively maturing a DSL would make more sense and be more efficient for the exercise. As a result, participants spent lots of time trying to put everything into a couple of tables, looking for suitable formats rather than simply writing tests as the examples unfolded. I also saw the teams progressively overwhelmed by the complexity of the requirements and losing track of the basic examples they had discussed at the very beginning of the exercise. They did not want to write tests until they had a sufficient idea of the requirements and understood their complexity.


This workshop taught me that using tests for exploring needs and specifying software isn't that obvious. We should definitely think of some communication techniques so that teams get some guidance in this field, something like the TDD process applied to requirements (well, we've come full circle, that is the idea of test-driven requirements after all...). TDD does not let you write a bunch of tests at once and then produce the code; it tells you to do one thing at a time, so you're not overwhelmed by the complexity. What could the process for TDR be? What about: 1. get an example, 2. write a test, 3. get agreement on that test. We need to think about that, and I'm really eager to conduct the workshop at Agile 2008 to check the output.

May 15, 2008

Bent Jensen presented Lean to Valtech consultants

Yesterday evening we had a presentation of Lean Software Development by Bent Jensen at our Valtech premises in Paris. Bent is helping Elisabeth, a colleague from Valtech, on an engagement where our client wants to apply Lean techniques to reduce the cost of their product. Bent spent a couple of days in Paris and we took the occasion to organize this presentation as part of our weekly evening training agenda.

Bent gave a clear and concise presentation of what Lean is and where it comes from, and explained what Lean actually means in the software development field. After the presentation, Elisabeth talked about their ongoing engagement and what they have achieved so far, illustrating her explanations with pictures from the battlefield.

I really appreciated hearing about and discussing the application of Lean in the software development field. Having read the reference book The Machine That Changed The World, this gave me much more concrete examples and analogies in our field. The point of the presentation that I especially recall is when Bent talked about the so-called exploration phase (first iterations) and construction phase (after some iterations) in the lean development process he sketched on one of his slides. He pointed out that this is the way Toyota works in its product development process. Lots of discussions are going on in the Agile world about whether to run an "iteration 0" to settle some architectural components or frameworks, and I think he made an interesting point for this debate. To some extent, what Bent presented recalls a bit the RUP (Rational Unified Process) with its elaboration and construction phases (though without the notions of inception and transition).

Bent's company has a blog where he posts news. I found there some of the stories he told during his talk, like the Lego bug tracking system or his trip to Japan.

May 14, 2008

(Feed)Back from XP Day France 2008

I attended XP Day France 2008 last week, and also gave a talk on the second day. The conference was a success, as it recorded more participants than last year. From my point of view, I took part in sessions that were much more interesting than I had expected, and I especially enjoyed some enriching workshops.

In the morning of day 1, I attended the workshop on Lean System. It was a role-playing game in which a team had to build 10 paper shadocks in an assembly-line-like mode during rounds of 10 minutes. After each round the customer (the workshop leader) would accept or reject the finished units depending on what she found acceptable. The project manager would then calculate the financial results based on the sales (accepted shadocks), consumed raw material (paper sheets), team-member wages, facilities (tables), etc. I played the shareholder, not a very demanding role, as I just had to sit and wait for the financial results, then look very disappointed as the results were quite bad until the last rounds. Before beginning a new round, the team could take only 1 action to improve the output and increase the overall results. Besides being fun, the workshop actually taught some interesting lessons even though it was not really focused on software development:

  • first of all, I realized that Lean System is not mainly about removing waste; it is first about removing variability among instances of the same steps. This point, though, might not make much sense in software development, and this is probably why we talk much more about removing waste in lean software development,
  • applying a lean system may lead to reducing human resources while increasing productivity and quality. I thought the focus was only on productivity and quality while keeping resources stable. The workshop leader stressed that introducing a lean system within an organisation might fail badly if it does not happen in an economically favorable context, because employees will not contribute to a system that might lead to the removal of their own jobs.

In the afternoon of day 1, I attended the Lightning Talks led by my colleague Yannick. I also gave the XP laboratory a try but soon left the session as it turned out to be too technical for me. What I experienced there was indeed very weird: at the beginning of the session, the organizers explained briefly what had been achieved so far and what stories they proposed to implement in the first iteration of the session. They then asked whether the participants were committed to delivering these stories by the end of the 10-minute iteration. And guess what, nobody said no! None of us had a clear idea of the existing design, code quality, or past problems, but we all committed to delivering the stories. Even more striking is the fact that everyone but me then rushed to the available laptop to dig into the code and start programming. Well, I left the session soon after, as I'm quite unable to understand or program a line of Java. I did not feel like that was a great application of XP. I did, however, have the occasion to discuss this brief experience with the workshop organizers, François Beauregard and Eric Mignot from Pyxis, and it confirmed what I felt: the organizers let the team make some mistakes so that they could learn the XP values. They were also very surprised that no one, in any of the sessions, refused to commit to the proposed stories despite having no idea of the design and its problems. This is even more striking knowing that this laboratory did not have any financial stakes and that most of the participants were not beginners in the Agile world. That makes me think we still have a long way to go before we master the Agile values and apply them naturally.

Day 1 ended with a good dinner. I had the occasion to discuss extensively with an employee of Parkeon, especially about their use of FitNesse. I now have better examples to share if I again encounter the objection that FitNesse and TDR techniques are not suited to embedded software (an objection that I often hear when I'm teaching the TDR course).

I opened day 2 with my talk on Test-driven Requirements, while 2 of my colleagues also gave presentations (Romain on continuous integration and Nathalie on contracts for agile projects).

I then attended a very enriching workshop about Leadership. The principle was simple: the participants had to build structures with Lego blocks. We ran three such sessions, each with a different type of leadership. The first session was driven by 2 leaders with a commanding type of management, the second had no leader and the team built a church in a self-organizing fashion, and the third had 2 coaching leaders (I was one of them). A global retrospective at the end gave us the opportunity to discuss the characteristics of a good leader and what the leaders failed to do in the exercise. Beyond the lessons this workshop was supposed to bring about, I realized something more after the retrospective: when there is a leader, everyone holds him responsible for the failure or success of the exercise; on the other hand, when there is no leader, everyone tries to identify what was good and bad about their own behaviour. In brief, we naturally feel responsible when there is no designated leader, while we naturally shift all the responsibility onto the leader's shoulders when there is one. What this workshop tells me is: if you're a leader, expectations of you will be high whatever the type of management you apply, so you'd better succeed :)

Finally I attended a presentation on practicing conflict. The presenters gave their talk dressed in kimonos and played small, funny scenes to illustrate their points. Briefly speaking, it was about using an aikido-based technique to master a conflict rather than avoid it. Beyond these techniques, they stated that conflict is a mandatory step towards successful teams, arguing that conflicts can bring about better solutions. Although I understand and share some of their points, I still cannot consider conflict a must-have for successful teams. I rather think that confrontation is a must. Sometimes confrontations turn into conflicts, so we do indeed need techniques to master conflicts, but I don't see them as a healthy step toward success.

Before leaving, I spent some time discussing with the Pyxis guys and getting a demo of Greenpepper (great tool!).

That was my experience of the XP Day 2008 !

May 13, 2008

Agile 2008

My session proposal for Agile2008, the world conference on agile software development, has been officially selected! This is great news and means I will be flying to Toronto at the beginning of August to attend this exciting event. I'm scheduled on Tuesday the 5th at 10:45 AM, so I will have the pleasure of kicking off the conference sessions just after the keynote (Monday is reserved for research topics and the icebreaker). I now have to prepare the details of my session and make sure I can get the material I need to run the workshop smoothly and make it a successful experience for the participants. I will post updates on my preparation as it progresses.

April 29, 2008

Test-Driven Requirements versus functional test automation

I am updating the case study of the TDR course that I recently developed for Valtech Training. During the first sessions of the course, it turned out that the trainees had a difficult time understanding the differences between Test-Driven Requirements and GUI Functional Tests. So I decided to improve the case study with real examples of automated GUI functional tests.

The case study is a simple webapp that is supposed to be an online banking system (greatly simplified, of course). I am the product owner of this app, and a Valtech consultant developed it for me (because I can hardly code anything these days). I built the functional specifications with FitNesse, using test tables to interact with the developer and discuss the functional requirements. In turn, he used the fixtures in a TDD manner to drive his development. Below is an example of the functional requirement that describes an immediate money transfer:

As you can see, we mixed textual descriptions in a use-case style with an example to illustrate the description. The example is actually a test, specifying preconditions, actions, and verifications. We used a RowEntry fixture to specify the preconditions, i.e. to populate the database with test data, then an Action fixture to describe the actions, and a Row fixture to verify the results. The example is in French. If you cannot read French, you just need to know that we spent some time choosing appropriate fixture names, so that the test tables do not feel like automated tests or freaky technical stuff inserted into plain-text specifications. We really wanted the test tables to read like real examples, so we chose fixture names like "There exists the following accounts" or "Consult account details", which read more naturally than the freaky names we often see in FIT/FitNesse examples, such as "com.myapp.test.fixtures.ConsultAccount". We also put some effort into maturing a domain-specific testing language (DSTL), so that we could reuse the expressions for preconditions or verifications across different requirements. For example, we used the fixture "There exists the following accounts" a lot to set up initial test data.
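The screenshots of the French test tables are not reproduced here, but to give an idea of what sits behind such tables, here is a minimal sketch of FIT column fixtures in the spirit of the ones described above. The class names, columns, and the little in-memory BankContext are invented for this post (the real fixtures drove the actual webapp), and each class would live in its own file:

    import java.util.HashMap;
    import java.util.Map;
    import fit.ColumnFixture;

    // Illustrative in-memory stand-in for the application under test.
    public class BankContext {
        public static final Map<String, Double> accounts = new HashMap<String, Double>();
    }

    // Backs a set-up table like "There exists the following accounts":
    // each row of the table registers one account with its initial balance.
    public class ThereExistsTheFollowingAccounts extends ColumnFixture {
        public String accountNumber;   // input column
        public double balance;         // input column

        public void execute() {
            BankContext.accounts.put(accountNumber, balance);
        }
    }

    // Backs a verification table like "Consult account details": the column
    // "balance?" is checked against the value returned by this method.
    public class ConsultAccountDetails extends ColumnFixture {
        public String accountNumber;   // input column

        public double balance() {      // checked column ("balance?")
            Double value = BankContext.accounts.get(accountNumber);
            return value == null ? 0.0 : value;
        }
    }

With FIT, public fields map to input columns and public methods map to checked columns whose header ends with a question mark, which is what lets a table read like a plain example while still executing against the application.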

Now let's move on to GUI testing. I made use of a Selenium-FitNesse bridge (along the lines of webtest) to integrate my automated GUI tests with my specifications. The bridge provides a testing mini-language that allows non-technical people like me to write automated test cases without knowing Selenium scripting. I created a couple of cases to test the webapp; below is an example of an automated test case that traces to the specification given above:



In this example, we see that the flow of actions and verifications is overwhelmed with interface details and technical instructions (wait for the page to reload, etc.). The testing mini-language gives better readability than the equivalent Selenium script would (some test specialists like to call this "keyword-driven testing"), but still, the test case is not good at serving the specification side. Both types of tests are complementary and should co-exist. The first, FIT-like type serves to drive and support the specification process, and it also helps build a regression safety net; the second, Selenium-like type serves to test the application and make sure every piece of the software fits together properly. In the end, we should get something like this:

The screenshot above shows the organisation of my FitNesse wiki for the webapp. I created a topmost suite that contains two sub-suites: one is the specification artefact, containing the FIT-like tests; the other is the acceptance test artefact, containing the Selenium-like tests. I made use of links to trace tests from the second artefact back to the first, so that I can get an idea of the impacted GUI tests when I change the specifications. I can run the specification tests and the acceptance tests separately, or both with a single click. It should be clearer now that TDR and GUI functional testing serve different purposes. I used FitNesse and Selenium because they are open source, but the same kind of approach can be followed using proprietary test management tools like QualityCenter in combination with test robots and requirements management tools. It won't come at the same price, though.
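For contrast, here is a rough idea of what the same money-transfer scenario could look like if it were scripted directly against the Selenium RC Java client instead of going through the FitNesse bridge. The URL, locators, and labels are made up for illustration, but it shows how quickly interface details take over:

    import com.thoughtworks.selenium.DefaultSelenium;
    import com.thoughtworks.selenium.Selenium;

    public class ImmediateTransferGuiTest {
        public static void main(String[] args) {
            // Assumes a Selenium RC server running locally and the webapp on port 8080.
            Selenium selenium = new DefaultSelenium(
                    "localhost", 4444, "*firefox", "http://localhost:8080/");
            selenium.start();
            selenium.open("/bank/login");
            selenium.type("username", "demo");
            selenium.type("password", "demo");
            selenium.click("loginButton");
            selenium.waitForPageToLoad("30000");
            selenium.click("link=Virement immédiat");
            selenium.waitForPageToLoad("30000");
            selenium.select("fromAccount", "label=Compte courant");
            selenium.select("toAccount", "label=Livret A");
            selenium.type("amount", "100");
            selenium.click("submitTransfer");
            selenium.waitForPageToLoad("30000");
            // The only functional verification in the whole script:
            if (!selenium.isTextPresent("Virement effectué")) {
                throw new AssertionError("Transfer confirmation not displayed");
            }
            selenium.stop();
        }
    }

Compared with the FIT-style tables, the intent of the test is buried under navigation and synchronisation calls, which is exactly why this kind of script belongs in the acceptance-test suite rather than in the specification suite.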

February 11, 2008

Next generation testing tools

During an internal Open Space Technology session at Valtech about a year ago (12th Feb 07), we had a discussion on software testing frameworks and test automation tools. I initiated this discussion by proposing a topic about how to build a framework that would hide the programming details of functional test automation and help people focus on writing test cases. We exchanged views on existing commercial tools and their limits. We also discussed frameworks, or wrappers, built on top of those tools to allow for better test creation. Valtech India has built such a wrapper, named LoFat, based on Excel spreadsheets, which can use QuickTest Professional, Selenium, or SilkTest to execute the functional tests. My current client has also built such a wrapper on top of QuickTest Professional. We finished the open space discussion by sketching a simple, theoretical architecture to support our approach. After this Open Space, we did not keep the discussion rolling.

Some time ago I came across an article by Jenitta Andrea titled Envisioning the Next Generation of Functional Testing Tools. It is a big piece of work that depicts our expectations of test automation tools in a bright and comprehensive way. Following this article, Jenitta organised a workshop for the Agile Alliance in October 2007, gathering test automation tool experts to push the discussion further. There are many reports of this workshop on participants' blogs, and a digest on InfoQ. A discussion group has been created on Yahoo to keep the discussion rolling. I think the consultants who participated in our open space discussion a year ago should subscribe to this group, because there is really interesting stuff going on. As an example, an interesting web seminar is planned for the 19th of February, featuring Jenitta and Ward Cunningham, about next generation functional testing tools. Ward might present a new tool from Thoughtworks that supports the approach described by Jenitta. Well, reading the description of the seminar, there is no question that Test-Driven Requirements and Functional Testing are now one single issue.

January 3, 2008

The (Toyota) way to go

Toyota became the biggest car producer in 2007, overtaking General Motors, which had held this position for 80 years. This is quite big news! Toyota has been paving the way to efficient production management for decades, inventing lean and other manufacturing management concepts. Lean can be applied everywhere and transposed to many other businesses. Lots of companies look to this field to improve their process throughput while keeping resources stable.

In the software development field, the definitive reference for lean thinking is the work of the Poppendiecks, which is gathered on their website. Many agile development models and practices are either inspired by lean thinking or embody lean principles:
  • TDD implements a perfect "pull" organisation of the coding and unit-testing tasks (see the small sketch after this list)
  • eXtreme Programming has many practices that embody lean principles (Jacques Couvreur wrote a good article on how XP relates to the Toyota way),
  • Continuous Integration and software factories are a way to make jidoka happen in the software development process
  • Test-Driven Requirements (TDR) is an extension of TDD, applying lean principles to the other steps of the software development process (I made a presentation, in French, on this topic during the Valtech Days 2007)
  • and so on...
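To make the "pull" analogy of the first bullet a bit more concrete, here is a tiny, made-up TDD sketch in Java/JUnit (the Account example is not taken from any real project): the failing test expresses a concrete demand, and only then is the minimal production code written to satisfy it.

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    // Step 1: the test is written first and fails, because Account does not exist yet.
    public class AccountTest {
        @Test
        public void balanceOfANewAccountIsZero() {
            Account account = new Account();
            assertEquals(0.0, account.getBalance(), 0.0);
        }
    }

    // Step 2: written only after (and because) the test above demands it,
    // with just enough behaviour to make the test pass.
    class Account {
        private double balance = 0.0;

        public double getBalance() {
            return balance;
        }
    }

No code is produced without a test downstream asking for it, which is the essence of a pull system.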

Although it seems the Agile values were not directly inspired by Lean Thinking at the time of the Agile Manifesto, there are so many connections in the software development field today that it is difficult to tell them apart. However, I think that Lean covers a broader area of interest. Software development organisations that seek to be "agile" today will probably seek to be "lean" in the coming years.

December 12, 2007

Continuous integration and functional tests

I just wrote an article for my current client, an investment bank, to explain to project managers what it takes to run non-regression functional tests in a continuous integration process. When I say non-regression functional tests, I mean real functional tests that drive the application through the GUI. My main point is that continuous integration is not exactly the same as continuous testing, and that their respective objectives cannot be combined as easily as just plugging functional regression tests into the continuous integration process. This is actually what managers ask for when we start automating the functional tests: "can we integrate these tests into the continuous integration process?". Well, I could merely answer yes, for it is technically possible, but that answer would not be fair, because they cannot integrate the functional tests into their specific continuous integration process as it stands. Let me explain.

Usually the projects we work with have a continuous integration process including the following steps: compile, build, code check (this step is usually referred to as "test" but I don't like that name), and (sometimes) package. So they are doing continuous integration and they are continuously checking the code, period. The problem is that they want to do continuous testing, and I mean not only basic unit tests: they want to continuously run functional non-regression tests on every build. And that is another story, because it usually requires a fully operational testing environment to test the application at runtime (which is needed for "real" functional tests). Between the time a build is ready for testing and the time it can actually be functionally tested, lots of operations are conducted manually, like deployment tasks, database and platform setup, production data copies, startup of shared services, etc. Some managers don't realize that they must fully automate the deployment to the testing environment if they want to integrate functional regression tests into the continuous integration process. So the first point was to stress fully automating the deployment process.

But this is not the major issue to be addressed. Another point that I stressed was about not breaking the benefits of continuous integration. Continuous integration is based on quick feedback to the developers: the quicker the feedback, the better the fix. Anything that increases the delay between the time a developer checks his code into the source repository and the time he gets notified of a failed build jeopardizes the success of continuous integration. The thing is that functional regression testing is time consuming, and it comes after deployment (as noted above), which can be even more time consuming. So it does not look like a good option to automatically run functional regression tests in the same workflow as continuous integration. Besides, it does not make sense to functionally regression-test every build.

The solution that I advised is to create workflows that are triggered one after the other, providing increasing levels of verification while not breaking continuous integration. The target organization splits the job into at least 2 workflows, though there can be more:


  • the classical continuous integration workflow (compile, build), triggered as soon as a new piece of code is introduced into the repository. It is usually extended to check the code (unit tests, code rules, ...), which I advise doing only as long as the whole workflow does not exceed 10-20 minutes.
  • an "intermediate" workflow between the continuous integration workflow and the regression workflow, so that tasks like packaging, generating documentation, installing, or even code verification can be moved out of the continuous integration workflow and into this one. It should be triggered as soon as a continuous integration workflow has completed successfully (successful build).
  • a regression workflow that deploys to the test platform and runs the functional regression tests; it is triggered at most once a day (say, at night), and more probably once a week depending on the development cycles and team size.
In short, my answer to the question was:
- fully automate the deployment process
- separate the functional regression tests into a workflow distinct from the classical continuous integration workflow and trigger it at most once a day

After releasing this article to my client, I started thinking that "continuous integration process", as the name is used in the software programming field today, is really a bad name. But that is another story.

December 4, 2007

First post

This is a simple, yet meaningful title for this post, so that no one will expect too much out of it.

I finally created this technical blog, at least more technical than my personal blog, as an answer to Eric's prayers. I intend to drop here my thoughts and readings on subjects that are of some interest to me (well, this is the point of a blog after all), actually everything that relates more or less closely to software testing, namely:
- functional test automation
- test-driven requirements
- quality management
- continuous testing
- ... and many more topics that I don't even know yet