sss ssss      rrrrrrrrrrr
                      ssss    ss       rrrr   rrrr
                     sssss     s       rrrr    rrrr
                     ssssss            rrrr    rrrr
                      ssssssss         rrrr   rrrr
                          ssssss       rrrrrrrrr
                    s      ssssss      rrrr  rrrr
                    ss      sssss      rrrr   rrrr
                    sss    sssss       rrrr    rrrr
                    s  sssssss        rrrrr     rrrrr
         +=======    Quality Techniques Newsletter    =======+
         +=======               May 2005              =======+

QTN is E-mailed monthly to subscribers worldwide to support the
Software Research, Inc. (SR), eValid, and TestWorks user communities
and to provide other interested parties with information of general
use to the worldwide internet and software quality and testing
community.

Permission to copy and/or re-distribute is granted, and secondary
circulation is encouraged, provided that the entire QTN
document/file is kept intact and this complete copyright notice
appears in all copies.  Information on how to subscribe or
unsubscribe is at the end of this issue.  (c) Copyright 2005 by
Software Research, Inc.


                       Contents of This Issue

   o  eValid: Latest News, New Features, Updates

   o  21st Annual UK Performance Engineering Workshop

   o  Security Measurements and Metrics: Quality of Protection 2005

   o  2nd International Workshop on Hypermedia and Web Engineering

   o  eValid: Usage Recommendations

   o  Ten Critical QA Traps

   o  Web Engineering Journal

   o  Innovative Testing and Feasible V&V of Software Systems

   o  Visualizing Software for Understanding and Analysis

   o  QTN Article Submittal, Subscription Information


             eValid: Latest News, New Features, Updates

eValid is the premier WebSite Quality Testing & Analysis Suite.
eValid solutions help organizations maintain e-Business presence,
improve WebSite quality and performance, reduce down time, prevent
customer loss, and control costs.

eValid's Web Analysis and Testing Suite is comprehensive, yet
scalable and easy to use, and applies to a wide range of web
applications.  Because eValid is implemented inside an IE-equivalent
browser, you are guaranteed a 100% realistic user experience.

              Support for ASPs with Commercial License
License restrictions often limit how a consultant, a contractor, or
an Application Service Provider (ASP) firm can use eValid and
deliver the results to clients.  eValid's licensing now includes a
new option for ASPs that will simplify life, so both you and your
clients can benefit from eValid technology:

                    Ramping Up of LoadTest Runs
In server loading experiments a main goal often is to study how the
server complex responds to activity load that "steps up" at regular,
pre-programmed intervals.  Here's how to do this in eValid LoadTest:
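The stepped ramp-up idea can be illustrated with a small Python
sketch.  This is a rough, hypothetical driver -- not eValid's actual
LoadTest mechanism, where each session would be a playback browser
instance -- but it shows the shape of a load that steps up at regular
intervals:

```python
import threading
import time

def simulated_session(user_id):
    """Placeholder for one simulated user session.  (Hypothetical: in
    eValid, each of these would be a playback browser instance.)"""
    time.sleep(0.01)  # stand-in for script playback work

def ramped_load(users_per_step, step_interval_s, num_steps):
    """Start `users_per_step` new sessions at each interval, so the
    total load "steps up" at regular, pre-programmed intervals."""
    threads = []
    for step in range(num_steps):
        for _ in range(users_per_step):
            uid = len(threads)
            t = threading.Thread(target=simulated_session, args=(uid,))
            t.start()
            threads.append(t)
        if step < num_steps - 1:
            time.sleep(step_interval_s)
    for t in threads:
        t.join()
    return len(threads)  # total sessions launched

# Example: 3 steps of 5 sessions, 60 s apart, would launch 15 sessions.
```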

                     Playback Startup Sequence
To minimize playback de-synchronization as much as possible, the
latest eValid builds have a new and more-powerful playback startup
sequence.  The new startup sequence helps you manage disk cache and
cookie processing more reliably:

                      LoadTest Scenario Editor
eValid now includes a scenario editor that makes the job of setting
up a LoadTest simple.  It's got:

  * Ability to create a server loading scenario that focuses
    attention on how users and user types are allocated.

  * Ability to edit and re-edit existing or new loadtest scenarios.

  * Capability to automatically generate the underlying *.evl file.

Complete details on the scenario editor can be found at:

                      HTTP Detailed Reporting
An enhanced capability for monitoring detailed HTTP download times
and download errors has been added to the eValid playback engine.
Users can select to have HTTP errors reported as WARNINGs or ERRORs.
In addition, detailed timing logs generated by eValid now include
the specific byte size and download time of each page component.

For complete details see:
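The kind of per-component data involved can be illustrated with a
small Python sketch.  This `timed_fetch` helper is hypothetical -- it
is not eValid's playback engine -- but it records the same two facts
per component: byte size and download time, with failures flagged for
WARNING/ERROR-style handling:

```python
import time
import urllib.request

def timed_fetch(url):
    """Fetch one page component and record its byte size and download
    time -- the kind of per-component data a detailed timing log holds."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read()
    except Exception as exc:
        # A failed component could be logged as a WARNING or an ERROR,
        # depending on how strictly component errors are treated.
        return {"url": url, "error": str(exc)}
    elapsed = time.monotonic() - start
    return {"url": url, "bytes": len(body),
            "seconds": round(elapsed, 3)}
```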

                      Product Download Details
Here is the URL for downloading eValid if you want to start [or re-
start] your evaluation:

                   Contact Us With Your Questions
We welcome your questions about eValid and its applications.  We
promise a response to every question in ONE BUSINESS DAY if you use
the WebSite request form:


          21st Annual UK Performance Engineering Workshop
                          July 14-15, 2005
                 University of Newcastle-upon-Tyne

UKPEW is the leading UK forum for the presentation of all aspects of
performance modeling and analysis of computer and telecommunication
systems.  Original papers are invited on all relevant topics, but
papers on or related to the subjects listed below are particularly
welcome:
- Stochastic modeling of computer systems and networks.
- Tools for all aspects of performance engineering from workload
characterization to operational analysis.
- Modeling paradigms including, but not limited to: Stochastic
Process Algebras, Stochastic Petri Nets, and Queueing Theory.
- Solution and simplification techniques for large Markov processes.
- Formal notations for performance modeling.
- Modeling and analysis of reliability and performance guarantees.
- Performance-oriented design methods.
- Novel techniques in computer system simulation.
- Case studies and industrial reports.
- Experience and evaluation of existing tools and techniques.
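For readers new to the field, the simplest model in the queueing
theory mentioned above -- the M/M/1 queue -- already gives closed-form
performance predictions.  A small illustrative computation:

```python
def mm1_metrics(arrival_rate, service_rate):
    """M/M/1 queue: utilization rho = lambda/mu, mean response time
    W = 1/(mu - lambda), mean number in system L = rho/(1 - rho).
    Valid only for a stable queue, rho < 1."""
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate >= service rate")
    rho = arrival_rate / service_rate
    mean_response = 1.0 / (service_rate - arrival_rate)
    mean_in_system = rho / (1.0 - rho)  # equals lambda * W (Little's law)
    return rho, mean_response, mean_in_system

# 8 requests/s offered to a server that completes 10 requests/s:
# rho = 0.8, W = 0.5 s, L = 4 requests in the system on average.
```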

This workshop is the primary event at which the UK performance
community can get together to present and discuss their work.  While
primarily aimed at UK based performance engineers, we also welcome
contributions from researchers working in other countries and those
whose interest in performance is derived from experience in other
fields.  It provides an ideal opportunity for academics,
industrialists and PhD students to relate experience and to present
both finished work and work in progress.  The event has been
organized to sustain the atmosphere of informality that previous
UKPEWs have fostered.  As well as sessions for presenting the
papers submitted, it will also be possible to organize a session of
short presentations of work in progress.

All enquiries should be directed to


                 Security Measurements and Metrics:
                  QUALITY of PROTECTION - QoP 2005

               Milano, Italy, Thu. 15 September 2005.

                         WORKSHOP OVERVIEW

Information Security in Industry has matured in the last few
decades.  Standards such as ISO17799 and the Common Criteria, and a
number of industrial certification and risk analysis methodologies,
have raised the bar on what is considered a good security solution
from a business perspective.

Yet, if we compare Information Security with Networking or Empirical
Software Engineering we find a major difference. Networking research
has introduced concepts such as Quality of Service and Service Level
Agreements. Conferences and Journals are frequently devoted to
performance evaluation, QoS and SLAs. Empirical Software Engineering
has made similar advances. Notions such as software metrics and
measurements are well established. Processes to measure the quality
and reliability of software exist and are appreciated in industry.

Security looks different. Even a fairly sophisticated standard such
as ISO17799 has an intrinsically qualitative nature. Notions such as
Security Metrics, Quality of Protection (QoP) or Protection Level
Agreement (PLA) have surfaced in the literature but still have a
qualitative flavour. The "QoP field" in WS-Security is just a data
field to specify a cryptographic algorithm. Indeed, neither ISO17799
nor ISO15408 (the Common Criteria) addresses QoP sufficiently.
ISO17799 is a management standard, not directly concerned with the
actual quality of protection achieved; ISO15408 is instead a product
assessment standard and yet does not answer the question of how a
user of a product assessed by it can achieve a high QoP within
his/her operational environment. Both standards cover just one
aspect of an effective QoP and even the combination of both would
not address the aspect sufficiently. "Best practice" standards, such
as the baseline protection standard published by many government
agencies, also belong to the category of standards that are useful,
but not sufficient, for achieving a good QoP.

Security is different also in another respect. A very large
proportion of recorded security incidents has a non-IT cause. Hence,
while the networking and software communities may concentrate on
technical features (networks and software), security requires a much
wider notion of "system", including users, work processes,
organizational structures in addition to the IT infrastructure.

The QoP Workshop intends to discuss how security research can
progress towards a notion of Quality of Protection in Security
comparable to the notion of Quality of Service in Networking,
Software Reliability, or Software Measurements and Metrics in
Empirical Software Engineering.

SUBMISSION TOPICS:  Original submissions are solicited from industry
and academic experts to present their work, plans, and views related
to Quality of Protection. The topics of interest include but are not
limited to:

* Industrial Experience
* Security Risk Analysis
* Security Quality Assurance
* Measurement-based decision making and risk management
* Empirical assessment of security architectures and solutions
* Mining data from attacks and vulnerabilities repositories
* Security metrics
* Measurement theory and formal theories of security metrics
* Security measurement and monitoring
* Experimental verification and validation of models
* Simulation and statistical analysis, stochastic modeling
* Reliability analysis


                   2nd International Workshop on
                   Hypermedia and Web Engineering

                        in conjunction with
         ACM Hypertext 2005, Salzburg, September 6-9, 2005

The goal of this workshop is to bring together researchers and
developers from academia and industry to exchange ideas about the
problems they are facing during current Web / Hypermedia Projects,
and to discuss good practices and successful case studies (concrete
examples, designs and applications), as well as recent and
innovative results in Web and Hypermedia Engineering.

Web and Hypermedia engineering have provided several important
research and practical results especially during the last decade.
However, very few of them have transferred to real-life projects.
Engineers are unable to study all these results, since it is a time
consuming task, difficult to be accomplished in the strict timeline
of a project. As a consequence, web & hypermedia research results
are not used adequately (if at all) during the development of
current Hypermedia / Web Information Systems. As more research
results come up every year, they constitute a very complex
information space that itself needs to be engineered, in order to be
provided to developers in a meaningful and comprehensive way.

The main objective of this workshop is to survey, evaluate and
discuss the current practices, techniques and theory in the
engineering of modern Web / Hypermedia Information Systems. We
especially encourage submissions that emphasize the "work" aspect
based on concrete examples and designs, to facilitate discussions
among participants. The expected outcome of the workshop would be a
set of recommendations for methodologists, designers, and
applications developers regarding the main technical problems in
developing Hypermedia and Web Applications.

                         TOPICS OF INTEREST

- project management: process and product models
- systems architecture and modelling
- application development
- metrics (quality assurance, performance)
- technologies and tools survey & evaluation
- design patterns and good practices
- integration (methodologies, practices and tools)

                        WORKSHOP ORGANISERS

                     Dr. Sotiris Christodoulou
             High Performance Information Systems Lab.
             Computer Engineering and Informatics Dept.
                    University Of Patras, Greece
                        Phone: +302610993805
                         Fax: +302610997706

                         Dr. Michail Vaitis
                      Department of Geography
                      University of the Aegean
                     GR-811 00 Mytilene, Greece
                       Phone: +30 22510 36433
                        Fax: +30 22510 36409


               eValid -- Some General Recommendations

Here are common eValid problem areas and references to pages that
provide general good-practice recommendations.

* Functional Testing: Recording and playing back scripts, with
    validation, is a sure way to confirm operation of a web site or
    web application.

  o Protecting Login Account Names and Passwords

    If you are recording logging into a site, eValid will need to
    make a record of your account name and password.  For the best
    security, you should record login and password details in
    encrypted form.  There's an option in the Script Window Dialog
    to turn on the Encoded Input option that protects critical
    private information.  See:

  o Initial State

    Being a fully stateful recording and playback engine, eValid is
    very sensitive to the initial state when playback begins.  Here
    are some recommendations about how to manage your test's Initial
    State effectively.  See:

  o Session Cookies

    Session cookies are remembered inside eValid and the surest way
    to clear them is to close eValid and launch it again.

  o Modal Dialogs/Logins

    Because of the nature of modal dialogs you may not be able to
    use them directly.  Instead, eValid provides a way to construct
    a reliable script by creating the correct commands via the
    Script Window Dialog.  Check the documentation on modal dialog
    support and on testing modal logins:

  o Opaque Objects

    Certain objects are opaque relative to eValid's internal view of
    web page properties, and have to be treated differently.  These
    object types include Java Applets and FLASH objects, discussed
    in the documentation.

    In addition, it may be helpful to see how to use eValid's
    Application Mode:

* Server Loading: eValid applies load to a server with multiple
    eValid browser instances.

  o Machine Adjustments

    If you want to get more than ~25 eValid copies running at one
    time, you probably need to make Machine Adjustments to optimize
    your computer as a server loading engine.  See:

  o Ramping LoadTest Runs

    The most common form of LoadTest application involves ramping up
    server load so you can study how server performance degrades as
    the load increases.

* Site Analysis: eValid site analysis runs are a powerful way to
    confirm website properties.

  o Avoid Logout During Scan

    A common problem during a site analysis scan is that eValid logs
    you out before the scan is done!  This happens when you start
    the scan after logging into a protected area and the eValid
    search spider navigates you to the "logout" page.  The way to
    avoid this is to make sure that your Blocked URLs List includes
    "logout" and "signoff".  See:


                       Ten Critical QA Traps

Quality assurance isn't what it used to be. It isn't where or when
it used to be, either. Whether you're talking about applications for
internal use or software products for market, the development cycle
has changed considerably in recent years.

The following is a list of 10 common QA traps. Now, they may seem
like common sense, but in our almost twenty years of testing
experience, we have often seen great companies with great products
fall into these traps.

  1. Unclear ownership of product quality.

  2. No overall test program design or goals.

  3. Non-existent or ill-defined test plans and cases.

  4. Testing that focuses narrowly on functional cases.

  5. No ad hoc, stress, or boundary testing.

  6. Use of inconsistent or incorrect testing methodology.

  7. Relying on inexperienced testers.

  8. Improper use of tools and automation, resulting in lost time
     and reduced ROI.

  9. No meaningful metrics for tracking bugs and driving quality
     back into development.

 10. Incomplete regression cycle before software release.

To avoid these traps, it is important to incorporate best practices
into your quality assurance process. The process should include an
evaluation of where you are with quality assurance today, what your
QA goals are, and what the gaps are in the process; finally, you
should build a roadmap to attain your goals.  Only after these steps
have been taken can you avoid these quality assurance traps.

(Note: This piece was included in the VeriTest Spring 2005
newsletter.)


                      Web Engineering Journal

The International Journal of Information Technology and Web Engineering
An official publication of the Information Resources Management Association
Published: Quarterly (Print and Electronic)
Deadline: July 15


                           David C. Rine
                   Professor of Computer Science
                      George Mason University


                         Ghazi I. Alkhatib
                       Senior Lecturer of MIS
                    Qatar College of Technology

Organizations are continuously overwhelmed by a variety of new
information technologies, many of them Web-based.  These new
technologies capitalize on the widespread use of network and
communication technologies for seamless integration of various issues in
information and knowledge sharing within and among organizations.
This emphasis on integrated approaches is unique to this journal and
dictates a cross-platform and multidisciplinary strategy for
research and practice.

Among topics to be included (but not limited to) are the following:

- Web systems architectures, including distributed, grid computing,
and communication systems processing
- Web systems engineering design
- Web systems performance engineering studies
- Web user interfaces design, development, and usability engineering
- RFID research and applications in web engineered systems
- Mobile, location-aware, and ubiquitous computing
- Ontology and semantic Web studies
- Software agent-based applications
- Integrated user profile, provisioning, and context-based
- Security, integrity, privacy and policy issues
- Quality of service and service level agreement issues among
integrated systems
- Information filtering and display adaptation techniques for
wireless devices
- Metrics-based performance measurement of IT-based and Web-based
- Data analytics for business and government organizations
- Integrated heterogeneous and homogeneous workflows and databases
within and across  organizations and with suppliers and customers
- Case studies validating Web-based IT solutions
- Data and knowledge capture and quality issues
- Data and knowledge validation and verification
- Knowledge structure, classification and search algorithms or
- Strategies for linking business needs and IT
- IT readiness and technology transfer studies
- IT Education and Training
- Human factors and cultural impact of IT-based systems
- Virtual teams and virtual enterprises: communication, policies,
operation, creativity, and innovation
- Competitive/intelligent information systems


The International Journal of Information Technology and Web
Engineering will be published by Idea Group Inc., publisher of the
Idea Group Publishing, Information Science Publishing, IRM Press,
Cybertech Publishing, and Idea Group Reference imprints.  For
additional information regarding manuscript submission and
subscriptions, please contact the Co-Editors-in-Chief or the
publisher.


                  Innovative Testing and Feasible
           Verification & Validation of Software Systems

Testing is confronted with a demand to reduce its cost.  Therefore
it should demonstrate its added value to its users more explicitly.
This requires more intelligent testing that clearly shows its
effectiveness and efficiency. At the same time we see a growing
interest in formal methods, e.g., to address problems of software
application in the automotive industry. But the question still
remains on the feasibility of V&V methods and techniques in
industrial practices. At the VVSS2005 symposium we will address
both: Testing as well as Verification and Validation. The symposium
will include invited presentations, tool exhibitions and poster
sessions. It will be organized on the 24th of November 2005 at
Technische Universiteit Eindhoven.

VVSS 2005 is offering its speakers a unique opportunity to present
problems, solutions and experiences with Testing and Verification &
Validation of Software Systems. We invite practitioners, customers
as well as researchers for a presentation in one of our 8 tracks.
The timeframe for a presentation is 45 or 30 minutes, including 10
or 5 minutes, respectively, for questions.

The Symposium is organized by the Laboratory for Quality Software
(LaQuSo), which is a joint
activity of Technische Universiteit Eindhoven and Radboud University
of Nijmegen.

               Dr. Ir. Teade Punter - ICT Consultant
              LaQuSo - Laboratory for Quality Software
          HG 5.91 Campus Technische Universiteit Eindhoven
                  P.O. Box 513, 5600 MB Eindhoven

             Tel: +31 40(0) 2472526 (direct: 2473735)

               LaQuSo webpage:
            Personal webpage:


                            VISSOFT 2005
                 3rd IEEE International Workshop on
        Visualizing Software for Understanding and Analysis
                         Budapest, Hungary
                        September 25th, 2005


The VISSOFT 2005 workshop will focus on visualization techniques
that draw on aspects of software maintenance, program comprehension,
reverse engineering, and reengineering.  This event will gather tool
developers, users and researchers in a unique format to learn about,
experience, and discuss techniques for visualizing software for
understanding and analysis.  The goal of the workshop is to work
towards being able to answer the question of what is a good
representation for a given situation, data availability, and
required tasks.

POSITION PAPERS:  We solicit thought-provoking position-style
papers that will be presented and used for discussion during the
workshop.  Criteria for selection will be based on the clarity of
the paper and the appropriateness of the topic for a group
discussion.  Note that we strongly encourage tool users to submit
papers
that discuss what they would like to see the tool designers deliver
in their tools, as well as position papers from tool designers.
Position papers will be available in advance of the workshop and
attendees will be encouraged to browse them beforehand to improve
the discussion.   Position papers should have a maximum of 6 pages
in IEEE proceedings format.  Papers accepted to the workshop will be
published in IEEE Computer Society Press proceedings.  Note that the
proceedings will be printed in black and white.

TOOL DEMONSTRATIONS:  Interested tool designers should submit a
description of their tool and how it can be used to enhance
understanding.  Submissions should have a maximum of 2 pages, with
an additional 2 pages for a description of the demonstration and
screen captures (if needed).  The number of tools demonstrated will
be limited (by the size of the room).  Criteria for selection will
be on the maturity of the tool (for example, whether participants
can try it), innovative aspects, and the quality of the description of
the tool and the tasks it supports.  We are particularly interested
in software visualization tools that assist in understanding the
static and dynamic aspects of the software during software
maintenance and development.

TOOL COMPETITION:  In order to encourage collaboration and
coordination between presentations, we suggest that participants
consider demonstrating their software tool using a similar software
system.  Details on the software system and available data will be
posted on the workshop website.

Stéphane Ducasse, University of Berne, Switzerland, and Université
de Savoie, France
Michele Lanza, University of Lugano, Switzerland
Andrian Marcus, Wayne State University, USA
Jonathan I. Maletic, Kent State University, USA
Margaret-Anne Storey, University of Victoria, Canada


    ------------>>> QTN ARTICLE SUBMITTAL POLICY <<<------------

QTN is E-mailed around the middle of each month to over 10,000
subscribers worldwide.  To have your event listed in an upcoming
issue, E-mail a complete description and full details of your Call
for Papers or Call for Participation at

QTN's submittal policy is:

o Submission deadlines indicated in "Calls for Papers" should
  provide at least a 1-month lead time from the QTN issue date.  For
  example, submission deadlines for "Calls for Papers" in the March
  issue of QTN On-Line should be for April and beyond.
o Length of submitted non-calendar items should not exceed 350 lines
  (about four pages).  Longer articles are OK but may be serialized.
o Length of submitted calendar items should not exceed 60 lines.
o Publication of submitted items is determined by Software Research,
  Inc., and may be edited for style and content as necessary.

DISCLAIMER:  Articles and items appearing in QTN represent the
opinions of their authors or submitters; QTN disclaims any
responsibility for their content.

TRADEMARKS:  eValid, HealthCheck, eValidation, InBrowser TestWorks,
STW, STW/Regression, STW/Coverage, STW/Advisor, TCAT, and the SR,
eValid, and TestWorks logo are trademarks or registered trademarks
of Software Research, Inc. All other systems are either trademarks
or registered trademarks of their respective companies.

        -------->>> QTN SUBSCRIPTION INFORMATION <<<--------

To SUBSCRIBE to QTN, to UNSUBSCRIBE a current subscription, to
CHANGE an address (an UNSUBSCRIBE and a SUBSCRIBE combined) please
use the convenient Subscribe/Unsubscribe facility at:


               Software Research, Inc.
               1663 Mission Street, Suite 400
               San Francisco, CA  94103  USA

               Phone:     +1 (415) 861-2800
               Toll Free: +1 (800) 942-SOFT (USA Only)
               FAX:       +1 (415) 861-9801
               Web:       <>