Wednesday, October 3, 2007

I talked to one of my best friends after 15 years

I am very, very happy today because Mritunjaya called me. A birthday celebration was happening when he rang, and I cancelled the call five times. When I finally received it, I could not hear properly who it was, so I told him I was in a meeting and would call him later.

I can imagine what Mritunjaya might have thought: I am calling Ashish after 15 years, and he is unable to talk to me! Then I called him back on his mobile. As soon as he said, "I am Mritunjaya," I was extremely happy. In the last 15 years, I had tried several times to find his contact details but could not get them. After a good effort he got my number, and I was... no words.

Let the world know that he was the only person during my school days with whom I was so attached. In every wrong or right, he used to be with me. For some homework, I used to solve maths for him [he told me on the phone, "I got my maths solver back again!"].

Thanks to technology that we could get in touch with each other. When I go home in November, I will definitely meet him.

I am not sure how he looks now, but I can still visualise his childish face. So many things have happened in 15 years.

Hey Mritunjaya, I am missing you a lot and want to meet you as soon as possible. Today I want to celebrate. There were some good official announcements today too, so it's a really good day.

Monday, October 1, 2007

To be a better tester

I feel that to be a better tester, one should have these soft skills [the list is minimal and I will add to it later]. These are completely my own ideas, and others may or may not agree.

1. Accountability
2. Responsibility
3. Communication Skills
4. Confidentiality
5. Daring to say unpleasant things.
6. Hard worker
7. Should not get irritated by the same repeated work.
8. Respect everyone, but never adopt a "yes sir" attitude when the other person is wrong, whoever that person may be.
9. Passionate about learning new technology
10. Flexible about the work assigned, working hours, and the team to work with.

Lastly, I felt from today's Monday musings that if you are being criticised, you are probably progressing.

Tuesday, September 25, 2007

India beats Pakistan in Twenty20 World Cup Final

India bowled out Pakistan with three balls remaining Monday to win the first Twenty20 World Cup final by five runs.
India hit 157 for five wickets at Wanderers Stadium before reducing Pakistan to 104-7 off 16 overs. Sohail Tanvir then hit 12 runs off four balls with two huge sixes to bring Pakistan back in contention.
Tanvir fell to make it 138-8 and Pakistan lost its last three wickets for 14 runs, Misbah-ul-Haq holing out to a catch by Sreesanth for 43 to end the match.
"We thought we would be able to chase between 150 and 160 easily," Pakistan captain Shoaib Malik said. "But we lost early wickets, and that didn't help us."
Rudra Pratap Singh and Irfan Pathan each took three wickets from their four overs for a combined total of just 42 runs, but it was Joginder Sharma who was asked to bowl the final over.
With Pakistan needing 13 runs off the last six balls, the rookie medium pacer opened with a horrible wide. He then bowled a dot ball before conceding a six. But he held his nerve and lured Misbah-ul-Haq into an attempted paddle, which went high into the air and straight to Sreesanth at fine leg inside the circle.
The catch gave India its first international title since it won the 1983 World Cup.
"We really wanted to do well because we played so badly in the World Cup," India captain Mahendra Singh Dhoni said. "And it was a real team effort that won the tournament for us."
India batted after Dhoni won the toss and mostly owed its total to 75 off 54 balls from opener Gautam Gambhir, who struck eight fours and two sixes.
His partners fell regularly and he was the last batsman dismissed when he was caught at short fine leg by Mohammad Asif off Umar Gul at the end of the 18th over.
India again struggled to accelerate its innings, with Yuvraj Singh - who smashed six sixes off an over against England - contained by the slow spin bowling of Shahid Afridi and Mohammad Hafeez.
Yuvraj made only 14 off 19 balls, and it was left to Rohit Sharma to hit an undefeated 30 off 16 balls at the end to lift the total to a defendable level.
"We should have got 180," Dhoni said. "But Pakistan's bowling was really excellent, especially Umar Gul."
Gul was the best of the Pakistani bowlers with 3-28.
Pakistan then started its innings badly, losing Mohammad Hafeez in the first over to Singh, but Imran Nazir compensated for that poor start by taking 21 runs off Sreesanth's first over.
With wickets falling regularly and Nazir run out for 33 by a great direct hit from Robin Uthappa, Misbah-ul-Haq had to try and lift his team with 43 off 38 balls.
Tailender Tanvir also hit two sixes right at the end, but Pathan led the Indian bowlers with 3-16, while Singh's 3-26 helped set up the middle-overs squeeze which eventually led to the victory.
"For us, the turning point was Imran Nazir's run out - we were always struggling after that," Malik said.
He also excused Shahid Afridi's dismissal, caught in the deep for a duck off his first ball.
"That's the way Shahid plays; he is player of the tournament, so it's an honour to have him in our team," he said.

Saturday, September 22, 2007

Testing Mistakes

Note: This text is taken from

The role of testing
  • Thinking the testing team is responsible for assuring quality.
  • Thinking that the purpose of testing is to find bugs.
  • Not finding the important bugs.
  • Not reporting usability problems.
  • No focus on an estimate of quality (and on the quality of that estimate).
  • Reporting bug data without putting it into context.
  • Starting testing too late (bug detection, not bug reduction)

Planning the complete testing effort

  • A testing effort biased toward functional testing.
  • Underemphasizing configuration testing.
  • Putting stress and load testing off to the last minute.
  • Not testing the documentation
  • Not testing installation procedures.
  • An overreliance on beta testing.
  • Finishing one testing task before moving on to the next.
  • Failing to correctly identify risky areas.
  • Sticking stubbornly to the test plan.

Personnel issues

  • Using testing as a transitional job for new programmers.
  • Recruiting testers from the ranks of failed programmers.
  • Testers are not domain experts.
  • Not seeking candidates from the customer service staff or technical writing staff.
  • Insisting that testers be able to program.
  • A testing team that lacks diversity.
  • A physical separation between developers and testers.
  • Believing that programmers can't test their own code.
  • Programmers are neither trained nor motivated to test.

The tester at work

  • Paying more attention to running tests than to designing them.
  • Unreviewed test designs.
  • Being too specific about test inputs and procedures.
  • Not noticing and exploring "irrelevant" oddities.
  • Checking that the product does what it's supposed to do, but not that it doesn't do what it isn't supposed to do.
  • Test suites that are understandable only by their owners.
  • Testing only through the user-visible interface.
  • Poor bug reporting.
  • Adding only regression tests when bugs are found.
  • Failing to take notes for the next testing effort.

Test automation

  • Attempting to automate all tests.
  • Expecting to rerun manual tests.
  • Using GUI capture/replay tools to reduce test creation cost.
  • Expecting regression tests to find a high proportion of new bugs.

Code coverage

  • Embracing code coverage with the devotion that only simple numbers can inspire.
  • Removing tests from a regression test suite just because they don't add coverage.
  • Using coverage as a performance goal for testers.
  • Abandoning coverage entirely.
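The first point above is worth a concrete illustration. In this hedged sketch (the function and tests are invented for illustration, not taken from any real project), two tests execute every line of a function and so report 100% line coverage, yet they never probe the input that matters:

```python
def discount(price, is_member):
    """Apply a 10% member discount. Bug: negative prices pass through unchecked."""
    if is_member:
        return price * 0.9
    return price

# These assertions execute every line of discount() -- 100% line coverage --
# but neither test probes a negative price, so the missing validation is unseen.
assert discount(100, True) == 90.0
assert discount(100, False) == 100
assert discount(-50, False) == -50  # full coverage, and the bug is still there
```

Coverage here says the tests are complete; a tester asking "what inputs were never tried?" would disagree.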

Wednesday, September 12, 2007

Important Declaration [Please Read Before Commenting]

Hi All,

Most of the technical content of my blog is taken directly from various web searches or blogs. As I started writing this blog very recently, I didn't note who wrote particular content during my web surfing. Some of the content, as I remember, is taken from James Bach's blog.
From now onwards I will keep track of the authors and will give appropriate credit to them in my blog.

Thanks for your understanding.

Tuesday, September 11, 2007

Got ticket to home

I got the ticket for my trip home. It's great that Vineet, Rajesh, and I are going together up to Howrah. After that, Vineet and Rajesh will go to Hazaribagh, and I will catch another train from Sealdah to Katihar.

It's going to be great fun on the train. Vineet will be showcasing his talent specific to long train journeys. :)

My journey will start on 2nd Nov, and I will reach my home on 5th Nov.

Software Testing Videos

These are some of my favourite links for watching videos related to Software testing
Become a Software Testing Expert
Building Testing Skills

Bhatotar, Barhara Kothi, Purnia

My village's name is Bhatotar. It is a small village with a population of approximately 6,000. In my village, people of all castes live together and have shown a true secular spirit.

My home is in Brahmin Tola. There are many people here who are well educated and well placed all over India and abroad.

My village is well connected to other parts of Purnia by rail and bus. The nearest railway station is Barhara Kothi, a 5-minute walk from my home, and the nearest bus stand, also at Barhara Kothi, is a 10-minute walk away.

There are various temples in my village, but Kali Sthan is very famous. Every year the Diwali celebration here is really great. Most of the NRBs [Non-Resident Biharis] come home on the occasion of Diwali.

One can reach my home via Purnia/Katihar. To reach Katihar, you can get a train or bus from Patna/Delhi/Guwahati/Kolkata/Bangalore/Mumbai, etc.

Performance Testing


This note outlines a proposed set of standard counters to monitor during a basic performance test. These include counters specific to a web server and database server. Operating system counters cover Windows servers only. Web and database counters are for MS IIS and MS SQL Server but should be useful for other technologies.

The counters included are a minimal set designed to ensure bottlenecks have not occurred during a performance test. If bottlenecks have occurred it may be appropriate to consider re-running tests with additional counters collected.

This note covers the following:

  • what statistics to collect

  • how to set up collection of monitoring counters

It does not cover how to analyse logs containing these counter values. The references below or other documentation should be consulted as required.


Two very useful references have been located in preparing this note.

Each has a number of sub-pages related to memory, CPU, network etc.

The following has guidance on what counters to collect and setup screenshots:

What statistics to collect

What to collect – all Windows servers

Memory: Available Mbytes

Memory: Pages/sec

Memory: Page Faults/sec

Network Interface: [Relevant NIC]: Output Queue Length

Network Interface: [Relevant NIC]: Bytes Received/sec

Network Interface: [Relevant NIC]: Bytes Sent/sec

Network Segment object: %Network Utilization (if present)

Physical Disk: Avg Disk Queue Length: Total

Physical Disk: Disk Bytes/sec: Total

Physical Disk: % Disk Time: Total

Processor: % Processor Time: Total

Server: Bytes Received/sec

Server: Bytes Transmitted/sec

System: Context Switches/sec

System: Threads

System: Processes

System: Processor Queue Length

TCP: Connections Established

What to collect – web servers

Internet Information Services Global: File Cache Hits %,

Internet Information Services Global: File Cache Flushes

Internet Information Services Global: File Cache Hits

Web Service: Bytes sent/sec

Web Service: Bytes received/sec

Web Service: Connection Attempts/sec

Web Service: Current Connections

Process: Working set: Inetinfo.exe (or alternative web server .exe)

What to collect – database servers

SQLServer: General Statistics: Logins/sec

SQLServer: General Statistics: User Connections

SQL Server SQL Statistics: Batch Requests/sec

SQLServer: Access Methods: Page Splits/sec

SQLServer: Buffer Manager: Buffer Cache Hit Ratio

SQLServer: Buffer Manager: Free buffers

SQLServer: Cache Manager: Cache hit ratio

SQLServer: Locks: Average Wait Time (ms)

SQLServer: Locks: Number of Deadlocks/Sec

SQLServer: Latches: Average Latch Wait Time (ms)

SQLServer: Latches: Latch Waits/Sec

SQLServer: Memory Manager: Target Server Memory (KB)

SQLServer: Memory Manager: Total Server Memory (KB)

If you need more details on what queries are being executed and so on,

How to set up collection of monitoring counters

All the above mentioned counters can be set up via the Windows Perfmon interface. For specifics of Perfmon, you are advised to consult relevant help pages, Microsoft’s web site or other readily available articles.

Perfmon provides a real-time monitor and file-logging capabilities. During a performance test, set up file logging and shut down the real-time monitor.

Where to start (local or remote)

It is possible to set up monitoring of counters remotely. This has the benefit that the logs can be started/stopped from one place, they are stored in one place, and you can use a dedicated monitoring PC which is unlikely to be stressed during the test.

This will add a little network traffic but that should not interfere with a normal performance test unless the network is already approaching utilisation limits.

The referenced documents contain more notes which may help decide whether local or remote monitoring is preferable, but they are not unanimous on the subject.

The instructions below assume remote monitoring.

What to do

The following instructions (for Windows XP) should be useful as a guide:

To set up logging of counters

Start->Control Panel->Administrative Tools->Performance

In performance logs and alerts, right click Counter logs and select New log settings

Choose a log name eg. System_Counters_Web

(log file will be named eg C:\Perflogs\System_Counters_Web_000001.blg)

General tab:

Select Add Counters.

Select Add counters from computer and specify name (or IP address) of web server.

Select the object, then the counter within the object.

Repeat for all required counters on that server then Close.

Verify that all required counters are displayed.

Select Sample counters every 15 seconds.

Log Files tab:

Set Log file type to Text file (comma delimited).

Schedule tab: leave defaults

When you have finished, click Apply, then click OK.

Logging of the counters to the specified file should start immediately.
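Since the log file type above is comma-delimited text, the resulting logs can be summarised with a short script once a test run is over. A minimal sketch in Python (the server name WEBSRV and the sample readings are made up for illustration; real Perfmon logs have a "(PDH-CSV 4.0)" timestamp column followed by one column per counter):

```python
import csv
import io
import statistics

# Hypothetical excerpt of a Perfmon comma-delimited log for illustration only.
SAMPLE_LOG = """\
"(PDH-CSV 4.0)","\\\\WEBSRV\\Processor(_Total)\\% Processor Time","\\\\WEBSRV\\Memory\\Available MBytes"
"10/03/2007 10:00:15","12.5","812"
"10/03/2007 10:00:30","55.0","640"
"10/03/2007 10:00:45","92.5","388"
"""

def summarise_counters(csv_text):
    """Return {counter_name: (mean, max)} for each counter column in the log."""
    reader = csv.DictReader(io.StringIO(csv_text))
    columns = {}
    for row in reader:
        for name, value in row.items():
            if name.startswith("(PDH-CSV"):
                continue  # skip the timestamp column
            columns.setdefault(name, []).append(float(value))
    return {name: (statistics.mean(vals), max(vals)) for name, vals in columns.items()}

summary = summarise_counters(SAMPLE_LOG)
for counter, (mean, peak) in summary.items():
    print(f"{counter}: mean={mean:.1f}, max={peak:.1f}")
```

Point summarise_counters at the contents of a real log such as System_Counters_Web_000001 to get a quick mean/peak view of each counter after a test run.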

To stop logging:

Start->Control Panel->Administrative Tools->Performance

In performance logs and alerts, left click Counter logs and right click on the log eg. System_Counters_Web.

Then select Stop.

You can stop or change properties of the log.

Logs are green when running, red when stopped.

To start logging:

Start->Control Panel->Administrative Tools->Performance

In performance logs and alerts, left click Counter logs and right click on the log eg. System_Counters_Web.

Then select Start.

A file with a new suffix is created (eg. System_Counters_Web_000002).


It is also possible to use the Schedule tab to schedule the start and stop time of logging. If you do use Schedule, you still need to reset the start date after each log file is created (i.e. the interface does not allow Recurrence to be set up).

Software Testing Related Blogs

  • Antony Marcano's Blog

  • Ben Simo's Blog

  • Cem Kaner's Blog

  • David Gilbert's Blog

  • Elisabeth Hendrickson's blog

  • Jamie Dobson's Blog

  • Jonathan Kohl's Blog

  • Karen Johnson's Blog

  • Matthew Heusser's Blog

  • Michael Bolton's Blog

  • Mike Kelly's Blog

  • Pradeep Soundararajan's Blog

  • Sajjadul Hakim's Blog

  • Scott Barber's Blog

  • Shrini Kulkarni's Blog


    Are You a Testing Expert ?

    Analyze these claims:
    1. “You should write a test plan”
    2. “It’s important that testing be repeatable”
    3. “Each test case should have an expected result”
    4. “Test automation saves money and time”
    5. “All testing is based on a model of what is being tested”
    6. “Good enough quality is not good enough”
    7. “An undocumented test cannot be improved”
    8. “Exploratory testing is a useful practice”
    9. “It’s better to use the term defect than bug”
    10. “Ambiguity should be removed from requirements.”

    Non-Experts are More Likely to Say…
    – Yes, that’s what the books say.
    – This is right.
    – That is wrong.
    – I don’t know. {awkward silence}
    Experts are More Likely to Say…
    – Tell me more about the context.
    – I can think of how that might be true and I can think of how it might be false. Let’s think it through…
    – Let me reframe that…
    – Here are some possible answers…
    – Here’s one way I’ve solved this…
    – I don’t know. Here’s how I will find out…

    What’s Special About Testing
    There are few people around to teach you how to test.
    Most of what is taught as “testing” is unreliable or misleading folklore.
    Testing is a complex problem-solving activity.
    Learning testing on your own doesn’t cost you much, you don’t need anyone’s permission, and it generally poses no threat to life or property.

    It’s hard to know if you are doing it well.
    Good testing varies quite a lot with the context.

    Experts have…
    – Situational awareness
    – Confidence in confusion
    – Colleague network
    – Trained reflexes
    – Awareness of limitations
    – Diverse experiences
    – Relevant knowledge
    – Mental models for problem-solving

    Experts do…
    – Avoid traps and dead ends
    – Systematic inquiry
    – Confront authority and convention
    – Self-training and retraining
    – Self-criticism
    – Pattern matching on experience
    – Coherent explanations
    – Justify methodology
    – Write, speak, teach

    36 Important words for Software Tester

    1. Customers
    2. Information
    3. Developer relations
    4. Team
    5. Equipment & tools
    6. Schedule
    7. Test Items
    8. Deliverables
    9. Structures
    10. Functions
    11. Data
    12. Platforms
    13. Operations
    14. Time
    15. Capability
    16. Reliability
    17. Usability
    18. Security
    19. Scalability
    20. Performance
    21. Installability
    22. Compatibility
    23. Supportability
    24. Testability
    25. Maintainability
    26. Portability
    27. Localizability
    28. Function testing
    29. Domain testing
    30. Stress testing
    31. Flow testing
    32. Scenario testing
    33. Claims testing
    34. User testing
    35. Risk testing
    36. Automatic testing

    Top 24 replies by programmers when their programs don't work

    24. "It works fine on MY computer"
    23. "Who did you login as ?"
    22. "It's a feature"
    21. "It's WAD (Working As Designed)"
    20. "That's weird..."
    19. "It's never done that before."
    18. "It worked yesterday."
    17. "How is that possible?"
    16. "It must be a hardware problem."
    15. "What did you type in wrong to get it to crash?"
    14. "There is something funky in your data."
    13. "I haven't touched that module in weeks!"
    12. "You must have the wrong version."
    11. "It's just some unlucky coincidence."
    10. "I can't test everything!"
    9. "THIS can't be the source of THAT."
    8. "It works, but it's not been tested."
    7. "Somebody must have changed my code."
    6. "Did you check for a virus on your system?"
    5. "Even though it doesn't work, how does it feel?"
    4. "You can't use that version on your system."
    3. "Why do you want to do it that way?"
    2. "Where were you when the program blew up?"
    1. "I thought I fixed that."

    Software Engineering Proverbs

    A clever person solves a problem.
    A wise person avoids it.

    -- Einstein

    André Bensoussan once explained to me the difference between a programmer and a designer:

    "If you make a general statement, a programmer says, 'Yes, but...'
    while a designer says, 'Yes, and...'"

    No matter what the problem is,
    it's always a people problem.

    Jerry Weinberg

    Wexelblat's Scheduling Algorithm:

    Choose two:

    • Good
    • Fast
    • Cheap

    Craziness is doing the same thing and expecting a different result.

    Tom DeMarco, rephrasing Einstein, who said

    Insanity: doing the same thing over and over again and expecting different results.

    "There's no time to stop for gas, we're already late"

    -- Karin Donker

    Deming's 14 points

    1. Create constancy of purpose.
    2. Adopt the new philosophy.
    3. Cease dependence on mass inspection to achieve quality.
    4. Minimize total cost, not initial price of supplies.
    5. Improve constantly the system of production and service.
    6. Institute training on the job.
    7. Institute leadership.
    8. Drive out fear.
    9. Break down barriers between departments.
    10. Eliminate slogans, exhortations, and numerical targets.
    11. Eliminate work standards (quotas) and management by objective.
    12. Remove barriers that rob workers, engineers, and managers of their right to pride of workmanship.
    13. Institute a vigorous program of education and self-improvement.
    14. Put everyone in the company to work to accomplish the transformation.

    We know about as much about software quality problems as they knew about the Black Plague in the 1600s. We've seen the victims' agonies and helped burn the corpses. We don't know what causes it; we don't really know if there is only one disease. We just suffer -- and keep pouring our sewage into our water supply.

    -- Tom Van Vleck

    The Troops Know

    • The schedule doesn't have enough time for maintenance in it.
    • A lot of bugs get past the tests.
    • Most old code can't be maintained.

    To go faster, slow down. Everybody who knows about orbital mechanics understands that.

    -- Scott Cherf

    Everybody Knows:

    • Discipline is the best tool.
    • Design first, then code.
    • Don't patch bugs out, rewrite them out.
    • Don't test bugs out, design them out.

    Everybody Knows:

    • If you don't understand it, you can't program it.
    • If you didn't measure it, you didn't do it.

    Everybody Knows:

    If something is worth doing once, it's worth building a tool to do it.

    Your problem is another's solution;
    Your solution will be his problem.

    Everybody Knows:

    • If you've found 3 bugs in a program, best estimate is that there are 3 more.
    • 60% of product cost comes after initial shipment.

    The significant problems we face cannot be solved by the same level of thinking that created them.

    -- Albert Einstein

    On the radio the other night, Jimmy Connors said the best advice he ever got was from Bobby Riggs:

    • do it
    • do it right
    • do it right now

    It is not enough to do your best: you must know what to do, and THEN do your best.

    -- W. Edwards Deming

    A leader is best when people barely know that he exists.
    Less good when they obey and acclaim him.
    Worse when they fear and despise him.
    Fail to honor people, and they fail to honor you.
    But of a good leader, when his work is done, his aim fulfilled,
    they will say, "We did this ourselves."

    -- Lao-Tzu

    You must be the change
    You wish to see in the world

    -- Gandhi

    Experiment escorts us last,
    His pungent company
    Will not allow an axiom
    An opportunity.

    -- Emily Dickinson

    when the cart stops
    do you whip the cart
    or whip the ox?

    Q: How many QA testers does it take to change a lightbulb?
    A: QA testers don't change anything. They just report that it's dark.

    Kerry Zallar

    Q: How many software engineers does it take to change a lightbulb?
    A: Just one. But the house falls down.

    Andrew Siwko

    One test is worth a thousand opinions.

    "If you didn't write it down, it didn't happen."

    This saying is popular among scientists (doing experiments), but I believe it applies to software testing, particularly for real-time systems.

    --Larry Zana

    We reject kings, presidents, and voting.
    We believe in rough consensus and running code.

    --Dave Clark (1992)

    I am a design chauvinist. I believe that good design is magical and not to be lightly tinkered with. The difference between a great design and a lousy one is in the meshing of the thousand details that either fit or don't, and the spirit of the passionate intellect that has tied them together, or tried. That's why programming---or buying software---on the basis of "lists of features" is a doomed and misguided effort. The features can be thrown together, as in a garbage can, or carefully laid together and interwoven in elegant unification, as in APL, or the Forth language, or the game of chess.

    -- Ted Nelson

    Software is Too Important to be Left to Programmers, by Meilir Page-Jones.

    "If you think good architecture is expensive, try bad architecture."

    -- Brian Foote and Joseph Yoder

    Abraham Lincoln reportedly said that, given eight hours to chop down a tree, he'd spend six sharpening his axe.

    -- TidBITS 654, quoted by Derek K. Miller, via Art Evans

    ... while we all know that unmastered complexity is at the root of the misery, we do not know what degree of simplicity can be obtained, nor to what extent the intrinsic complexity of the whole design has to show up in the interfaces. We simply do not know yet the limits of disentanglement. We do not know yet whether intrinsic intricacy can be distinguished from accidental intricacy.

    -- E. W. Dijkstra, Communications of the ACM, Mar 2001, Vol. 44, No. 3

    You can only find truth with logic if you have already found truth without it.

    -- Gilbert Keith Chesterton (1874-1936) " The Man who was Orthodox", via Paul Black

    Software Testing Types

    * COMPATIBILITY TESTING. Testing to ensure compatibility of an application or Web site with different browsers, OSs, and hardware platforms. Compatibility testing can be performed manually or can be driven by an automated functional or regression test suite.

    * CONFORMANCE TESTING. Verifying implementation conformance to industry standards. Producing tests for the behavior of an implementation to be sure it provides the portability, interoperability, and/or compatibility a standard defines.

    * FUNCTIONAL TESTING. Validating that an application or Web site conforms to its specifications and correctly performs all its required functions. This entails a series of tests which perform a feature-by-feature validation of behavior, using a wide range of normal and erroneous input data. This can involve testing of the product's user interface, APIs, database management, security, installation, networking, etc. Functional testing can be performed on an automated or manual basis using black box or white box methodologies.

    * LOAD TESTING. Load testing is a generic term covering Performance Testing and Stress Testing.

    * PERFORMANCE TESTING. Performance testing can be applied to understand your application or WWW site's scalability, or to benchmark the performance in an environment of third party products such as servers and middleware for potential purchase. This sort of testing is particularly useful to identify performance bottlenecks in high use applications. Performance testing generally involves an automated test suite as this allows easy simulation of a variety of normal, peak, and exceptional load conditions.

    * REGRESSION TESTING. Similar in scope to a functional test, a regression test allows a consistent, repeatable validation of each new release of a product or Web site. Such testing ensures reported product defects have been corrected for each new release and that no new quality problems were introduced in the maintenance process. Though regression testing can be performed manually an automated test suite is often used to reduce the time and resources needed to perform the required testing.

    * SMOKE TESTING. A quick-and-dirty test that the major functions of a piece of software work without bothering with finer details. Originated in the hardware testing practice of turning on a new piece of hardware for the first time and considering it a success if it does not catch on fire.

    * STRESS TESTING. Testing conducted to evaluate a system or component at or beyond the limits of its specified requirements to determine the load under which it fails and how. A graceful degradation under load leading to non-catastrophic failure is the desired result. Often Stress Testing is performed using the same process as Performance Testing but employing a very high level of simulated load.

    * UNIT TESTING. Functional and reliability testing in an Engineering environment. Producing tests for the behavior of components of a product to ensure their correct behavior prior to system integration.
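As a small, hedged illustration of the unit-testing category above (parse_version is an invented component, not taken from any real product), here is what such a test might look like with Python's unittest module:

```python
import unittest

def parse_version(text):
    """Split a dotted version string like '1.2.3' into a tuple of ints."""
    return tuple(int(part) for part in text.strip().split("."))

class ParseVersionTest(unittest.TestCase):
    """Behaviour checks run on the component before it is integrated anywhere."""

    def test_simple_version(self):
        self.assertEqual(parse_version("1.2.3"), (1, 2, 3))

    def test_surrounding_whitespace_is_ignored(self):
        self.assertEqual(parse_version(" 2.0 \n"), (2, 0))

    def test_garbage_raises(self):
        with self.assertRaises(ValueError):
            parse_version("not-a-version")

# Run the suite programmatically (unittest.main() would also work from a script).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ParseVersionTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Each test pins down one piece of expected behaviour, so a regression in the component is caught before system integration.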

    Some Important Sites

    Finding Friends

    Media sharing

    Business Networking



    going mobile

    social voice chat

    Weird networking sites

    Random/Technical Sites

    Some technical sites

    Types of testing that we'd like not to see :)

    AGGRESSION TESTING: If this doesn't work, I'm gonna kill somebody.
    CONFESSION TESTING: Okay, okay, I did program that bug.
    CONGRESSIONAL TESTING: Are you now, or have you ever been a bug?
    DEPRESSION TESTING: If this doesn't work, I'm gonna kill myself.
    EGRESSION TESTING: Uh-oh, a bug... I'm outta here.
    DIGRESSION TESTING: Well, it should work, but let me tell you about my truck...
    EXPRESSION TESTING: #@%^&*!!!, a bug!
    OBSESSION TESTING: I'll find this bug if it is the last thing that I do.
    OPPRESSION TESTING: You will test this, now!
    POISSION TESTING: Alors! Regardez le poission!
    REPRESSION TESTING: It's not a bug, it's a feature.
    SUCCESSION TESTING: The system is dead. Long live the new system!
    SUGGESTION TESTING: Well, it works but wouldn't it be better if...
    PRESIDENTIAL TESTING: Using the definition of testing as defined in the affidavit...

    Software Testing: Some Definitions

    Perhaps we need to embrace Tester Pride and let the world know about the contributions we make. Do your friends and neighbors know what you do for a living? Do they know of the contributions you make? Probably not. As far as I know, the only tester in the world who advertises his profession to total strangers on the street is James Bach with his well-known "TESTER" license plate.

    James's license plate got me thinking. What can we say about our work that would fit comfortably on the fender of a car? Here are my suggestions for bumper stickers that just might rock the industry.

    We could start by hijacking existing bumper sticker mottos:
  • Ask me about my latest bug.

  • Honk if you love to crash software.

  • My other car is a bug.

  • Have you hugged your software tester today?

    But those seem too lame and tame. How about emphasizing the unique mental attitudes of testers?
  • Software Testers: Always looking for trouble.

  • Software Testing is Like Fishing, But You Get Paid.

  • Software Testers: "Depraved minds...Usefully employed." ~Rex Black

  • Software Testing: Where failure is always an option.

    Or, we could emphasize the often-unnoted contributions testers make:
  • Software Testing: When Your System Actually Has to Work

  • Software Quality: Don't ship without it.

  • I don't make software; I make software better.

  • Improving the world one bug at a time.

    We could even support both sides of the "making and breaking" question:
  • Software Testing: You make it, we break it.

  • Software Testers don't break software; it's broken when we get it.

  • Software Testers: We break it because we care.

    Not bad for a start, but perhaps we'd like to get in a few digs at development while we are at it:
  • To err is human; to find the errors requires a tester.

  • If developers are so smart, why do testers have such job security?

  • My software can beat up your software.

  • A good tester has the heart of a developer... in a jar on the desk.

    But maybe that is too hard on our poor developers, and we are all in this together. What I'd like to see is developers' cars sporting the following:
  • Test is my copilot.

  • If your software works, thank a tester.

    Or, we could even support positions within our own testing community. I work with test automation and spec-based test generation most of the time, so how about these:
  • Old Model-Based Testers Never Die; They Just Transition to a Higher State.

  • Life is too short for manual testing.

  • Friends don't let friends do capture-replay.

  • Support spec-based testing: Be code-dependent no more!

  • People should think and machines should test.

  • Test never sleeps.

    There can be lofty sentiments for those idealists among us:
  • Visualize Great Software

    And some not-so-lofty sentiments for those whose ideals have taken a beating:
  • Trust, But Verify.

    Truthfully, though, I am a tester because that is what I have always been, even when I was a kid. I have always asked awkward questions that I felt needed to be asked. I always looked for answers I could be satisfied with. So, the bumper sticker that sums it up for me would be:
  • Pertempto ergo sum – I test, therefore I am.
    My First Blog: Ashish Kumar Jha

    Hi All,

    Welcome to my Blog. Here I will share my thoughts and knowledge.

    I have created this blog in order to consolidate my several blogs. From now onwards, I will use this blog mostly.