sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +=======    Quality Techniques Newsletter    =======+
         +=======            January 2004             =======+

QTN is E-mailed monthly to subscribers worldwide to support the
Software Research, Inc. (SR), eValid, and TestWorks user communities
and to provide other interested parties with information of general
use to the worldwide internet and software quality and testing
community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged by recipients of QTN, provided that the
entire document/file is kept intact and this complete copyright
notice appears in all copies.  Information on how to subscribe or
unsubscribe is at the end of this issue.  (c) Copyright 2003 by
Software Research, Inc.


                       Contents of This Issue

   o  eValid FT-150 Top Page Download Time Survey

   o  eValid: A Quick Summary

   o  SQRL Report Abstract

   o  About the Quality Week Conferences: 1998 - 2002

   o  AmiBug Course Schedule

   o  ISSRE 2004: Saint-Malo, France, November 2004

   o  Web Services and Agent-Based Engineering (WSABE)

   o  Fourth International Conference on Quality Software (QSIC)

   o  QTN Article Submittal, Subscription Information


            eValid: FT-150 Top Page Download Time Survey

How does WebSite quality track with company size and reputation?
How well do big companies' WebSites compare?  Can we learn something
by studying the technical properties of these big companies'
WebSites?

In an earlier in-depth study we detailed some partial answers to
these questions; see:


But what about a web surfer's very, very initial experience?  What
you might call the "speed of first encounter" experience.  What
happens when someone goes to the very topmost page of a WebSite?
For just these pages alone, how do the FT-150 sites compare?

                         Experimental Setup

To find out we used eValid's record/play capability to record total
page download times.

The playback script downloads the top page of each of the 150+
sites with eValid set up to run with an empty cache, exactly the way
you would download the page the very first time you navigate to that
page.  The full page timings are reported in the Performance Log.

We ran the test script on fast DSL connections.  From our office
here in San Francisco, we ran the test 10 times and averaged the
data.  We ran the same test from Eastern Canada, from France, and
from South Africa, all with similar-speed web connections.  To
minimize the effect of web latency, all the test downloads were run
in quick succession.  As a result, the performance data differences
we observed are due primarily to variations in the particular
WebSite's server capabilities and not to web latency or "last mile"
factors.
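eValid's full-page timing is not reproduced here, but the measurement
idea -- cold-cache downloads, repeated several times and averaged --
can be sketched in plain Python.  The URL below is a placeholder, and
this sketch times only the base HTML document, not the complete page
with all images as eValid does, so real numbers would be larger:

```python
# Hypothetical sketch of the survey's measurement idea (not eValid itself):
# fetch a top page with no cache, repeat RUNS times, and average.
# urllib does no caching, which approximates the empty-cache setting.
import time
import urllib.request

RUNS = 10  # the survey averaged 10 runs per site

def time_download(url, timeout=60):
    """Return (seconds to fetch the page body, size in bytes)."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        size = len(resp.read())
    return time.monotonic() - start, size

def average(timings):
    """Mean of a list of per-run download times."""
    return sum(timings) / len(timings)

if __name__ == "__main__":
    # Placeholder URL; the actual FT-150 site list is not reproduced here.
    for site in ["https://www.example.com/"]:
        runs = [time_download(site) for _ in range(RUNS)]
        avg = average([t for t, _ in runs])
        print(f"{site}: {runs[0][1] / 1024:.0f} KB, avg {avg:.2f} s")
```

Running the same script from several locations, as the survey did,
then lets you compare per-server behavior rather than "last mile"
effects.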


Here is a sampling of the results we found.  Some were expected;
some were amazing.

  o The achieved download rate varied by nearly 16:1.  Because the
    rates never exceeded 40% of available DSL capacity, and the
    tests were run quickly one after the other, we know that the
    wide variation we observed is due primarily to server
    performance.
  o The top page size on the 150+ WebSites varied from 17 KB to over
    575 KB, a 32:1 ratio.  Amazingly, some of the larger pages
    actually were downloaded and rendered quicker because their
    servers were so much faster.

  o The average top page took ~5.9 seconds to download.  This is
    well over the "3-second click-away rule" -- the guideline
    that is generally accepted as a worst-case response criterion.
    Amazingly, some pages took as long as 65 seconds.
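The arithmetic behind these figures is simple.  With made-up numbers
(not the survey's data) of the same shape, achieved rate is page size
divided by download time, and the n:1 spread is the max/min ratio of
those rates:

```python
# Illustrative arithmetic only -- the sizes and timings below are
# hypothetical, not the survey's measurements.
page_kb = {"site-a": 17, "site-b": 575, "site-c": 120}    # page sizes (KB)
secs    = {"site-a": 1.1, "site-b": 9.0, "site-c": 65.0}  # download times (s)

rates = {s: page_kb[s] / secs[s] for s in page_kb}        # KB per second
spread = max(rates.values()) / min(rates.values())        # rate variation, n:1
avg_secs = sum(secs.values()) / len(secs)                 # average download time
print(f"rate spread {spread:.1f}:1, average download {avg_secs:.1f} s")
```

Note how "site-b", the largest page in this toy data, still achieves
the best rate -- the same effect the survey observed when fast servers
delivered big pages quickly.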

The conclusion is that the biggest companies often don't pay as much
attention as you'd think to maximizing the "speed of first
encounter" -- probably to their detriment.  We think they could do a
lot better.

You can see the complete results -- including the full set of
download timing data -- at:


Please let us hear from you if you have any questions or comments!


                         SQRL Report No. 19

            Inspecting the Source Code that Implements
                     the PPP Protocol in Linux

                           Srdjan Rusovan

Abstract:  The Point-to-Point Protocol (PPP) is a widely accepted
standard used by almost everyone who connects to a server over a
telephone line. An open source implementation of the PPP protocol,
pppd - the PPP daemon, is included in various Linux distributions
and widely used by Linux users. Version 2.4.1 of the pppd is typical
of many software products in that the only documentation available
is the C programming language code and a high-level requirements
document containing an English language description of the system's
required behaviour, together with the state transition table for the
protocol's abstract state machine.

In order to perform a rigorous inspection of the pppd protocol state
machine code based upon Parnas' Display method, we required detailed
design documents that simply did not exist.  In [Parnas 1994] a
rigorous inspection method is presented which manually generates the
absent design documentation and then performs the inspection.
This thesis shows how parts of the process can be automated using
theorem proving techniques. Specifically, the reverse engineering of
the design documentation was done primarily in the PVS Specification
Verification System.  The resulting combination of inspection based
upon the Display method and theorem proving was used to find several
mistakes in the PPP implementation code.  Inconsistency in coding
style and significant issues regarding the readability of the code
were also discovered.

Thus the thesis also provides a detailed example of successful
application of the Display inspection method to a non-trivial
software system.

The web address for downloading reports is:


         About the Quality Week Conferences -- 1998 - 2002

Background:  The QW and QWE Conference events provided a valuable
service to the worldwide software quality community for many years.

Some Statistics:  We organized 15 Quality Week Conferences (held
annually from 1987 through 2002) and 5 Quality Week Europe
Conferences (held annually from 1998 through 2002). There have been
8 conferences since 1998 (one QWE was delayed because of 9/11).

The Volunteer QW/QWE Advisory Boards for our conferences involved
some ~275 people, many of whom served for multiple years and for
multiple Conferences. They were chosen for their expertise and their
ability to guide the technical content of the event, and they did a
great job! They are the main reason QW/QWE papers always got high
marks for technical content, and why so many QW/QWE papers are cited
in the technical literature.

Overall the twenty QW/QWE Conferences presented some 115 Keynote
Talks by top-notch industry leaders. The Conferences included 35
tutorial days, involving nearly 200 different tutorial speakers. The
main Conference tracks involved 800+ regular technical
presentations.

The total attendance at all QW/QWE events was somewhat over 9,000
people, including over 7,000 in the last 5 years of the event. Over
the years, some ~275 different companies exhibited their products
and services at the Quality Week Expo.

Firsts & Kudos:  The QW/QWE conferences featured many industry-first
announcements and presentations. Our unique approach of strictly
separating vendor and commercial interests from the technical
conference content was a key factor in that success. The QW/QWE
conferences spotted future trends and presented "first ever" topics
on a range of new technologies, new approaches to software quality,
and new initiatives in the industry.

The unique blend of research papers from an academic or R&D
perspective was combined very effectively with practical solution
papers usually derived from real-world experience. QW/QWE presented
a unique platform -- free from the biases of commercial
exploitation, and in a spirit of free and open exchange of ideas.

Moreover, the QW/QWE forum provided many now-well-known speakers
with their very first opportunities -- based solely on the merit of
their content not on the size of their employers' budget.  Many
QW/QWE speakers/presenters have made their fortunes in the software
quality arena in part because of the exposure they received at
QW/QWE.
As one of our Advisory Board Members commented, "QW/QWE provided a
professional, peer-reviewed conference that was otherwise missing."

The Future:  The main focus of QW/QWE conferences was in processes
and technology used to achieve quality in client-server software
development projects and applications. These technologies have
advanced so far that by 2004 it is clear that the need for enhanced
quality and new technologies for achieving quality in this area of
activity has diminished. In effect, the QW/QWE core objective of
supporting the achievement of client-server software quality has
been met.

It's time for QW/QWE to think about "what's next?"

What is the single ever-present reality where quality -- and the
possible lack of it -- has an over-arching impact? Just as in the
late 1990's through 2002 it was client-server software development,
in the decade to come it is self-evident that the biggest need for
"Quality" will be in the internet. That's the area we'll be
concentrating on in future Quality Week Conferences.

For your convenience we've prepared a comprehensive index of QW/QWE
papers from 1998 - 2002 at:



                       AmiBug Course Schedule

AmiBug private on-site workshops:

                     Public training in Ottawa:

Practical Hands on Testing:

Software Testing Methods and Tools:

Testing Web and E-Applications:

Fundamentals of IT Project Management:

                    Public training in Montreal:




       ISSRE 2004: 15th International Symposium on Software
                      Reliability Engineering
               November 2-5, 2004, Saint-Malo, FRANCE

This event includes: Technical paper tracks, Tutorials, Industry
best practice presentations, Workshops, and more!

ISSRE focuses on the theory and practice of Software Reliability
Engineering.  The conference scope includes techniques and practices
to (1) verify and validate software, (2) estimate and predict its
dependability, and (3) make it more tolerant/robust to faults. Over
the years, the conference has grown steadily attracting about 200
participants on a regular basis. The conference is big enough to
represent all the key topics in software reliability engineering,
but small enough to provide an in-depth representation of theory or
practice in these areas. Industry participation has also increased
over time, leading to a healthy mixture of theory and practice. This
year's theme is on the use of model-driven software development and
its implications for software dependability.  ISSRE 2004 will be
held in Rennes (Nov. 2) and Saint-Malo (Nov. 3-5). Both towns are
medieval cities. Saint-Malo is a corsair (a corsair is a kind of
official "pirate," hired by the king) city located on the north
coast of Brittany, close to the Mont Saint-Michel and to Rennes.
Every street corner, every shore recalls the city's legendary past,
along the innumerable walks and seafront promenades. Close to
Saint-Malo, the Mont Saint-Michel is one of Brittany's best-known
sights.

                           General Chair

Yves Le Traon, University of Rennes I; IRISA research institute,
France

                           Program Chairs

Lionel Briand, Software Quality Engineering Lab, Department of
Systems and Computer Engineering, Carleton University, Canada

Jeffrey Voas, Cigital, Inc., USA

            QTN Article Submittal, Subscription Information

QTN is E-mailed around the middle of each month to over 10,000
subscribers worldwide.  To have your event listed in an upcoming
issue E-mail a complete description and full details of your Call
for Papers or Call for Participation to .

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should
  provide at least a 1-month lead time from the QTN issue date.  For
  example, submission deadlines for "Calls for Papers" in the March
  issue of QTN On-Line should be for April and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research,
  Inc., and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items appearing in QTN represent the
opinions of their authors or submitters; QTN disclaims any
responsibility for their content.

TRADEMARKS:  eValid, HealthCheck, eValidation, TestWorks, STW,
STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR, eValid,
and TestWorks logo are trademarks or registered trademarks of
Software Research, Inc. All other systems are either trademarks or
registered trademarks of their respective companies.

        -------->>> QTN SUBSCRIPTION INFORMATION <<<--------

To SUBSCRIBE to QTN, to UNSUBSCRIBE a current subscription, to
CHANGE an address (an UNSUBSCRIBE and a SUBSCRIBE combined) please
use the convenient Subscribe/Unsubscribe facility at:


               Software Research, Inc.
               1663 Mission Street, Suite 400
               San Francisco, CA  94103  USA

               Phone:     +1 (415) 861-2800
               Toll Free: +1 (800) 942-SOFT (USA Only)
               FAX:       +1 (415) 861-9801
               Web:       <>