This blog is my personal web diary where I write about anything that happens around me and anything I learn. I work as a software engineer in Hyderabad. I have a great positive attitude and enthusiasm. I believe nothing is impossible. (*Conditions Apply)
Tuesday, September 25, 2007
India beats Pakistan in Twenty20 World Cup Final
India bowled out Pakistan with three balls remaining Monday to win the first Twenty20 World Cup final by five runs.
India hit 157 for five wickets at Wanderers Stadium before reducing Pakistan to 104-7 off 16 overs. Sohail Tanvir then hit 12 runs off four balls with two huge sixes to bring Pakistan back in contention.
Tanvir fell to make it 138-8 and Pakistan lost its last three wickets for 14 runs, Misbah-ul-Haq holing out to a catch by Sreesanth for 43 to end the match.
"We thought we would be able to chase between 150 and 160 easily," Pakistan captain Shoaib Malik said. "But we lost early wickets, and that didn't help us."
Rudra Pratap Singh and Irfan Pathan each took three wickets from their four overs for a combined total of just 42 runs, but it was Joginder Sharma who was asked to bowl the final over.
With Pakistan needing 13 runs off the last six balls, the rookie medium pacer opened with a horrible wide. He then bowled a dot ball before conceding a six. But he held his nerve and lured Misbah-ul-Haq into an attempted paddle, which went high into the air and straight to Sreesanth at fine leg inside the circle.
The catch gave India its first international title since it won the 1983 World Cup.
"We really wanted to do well because we played so badly in the World Cup," India captain Mahendra Singh Dhoni said. "And it was a real team effort that won the tournament for us."
India batted after Dhoni won the toss and mostly owed its total to 75 off 54 balls from opener Gautam Gambhir, who struck eight fours and two sixes.
His partners fell regularly and he was the last batsman dismissed when he was caught at short fine leg by Mohammad Asif off Umar Gul at the end of the 18th over.
India again struggled to accelerate its innings, with Yuvraj Singh - who smashed six sixes in an over against England - contained by the slow spin bowling of Shahid Afridi and Mohammad Hafeez.
Yuvraj made only 14 off 19 balls, and it was left to Rohit Sharma to hit an undefeated 30 off 16 balls at the end to lift the total to a defendable level.
"We should have got 180," Dhoni said. "But Pakistan's bowling was really excellent, especially Umar Gul."
Gul was the best of the Pakistani bowlers with 3-28.
Pakistan then started its innings badly, losing Mohammad Hafeez in the first over to Singh, but Imran Nazir compensated for that poor start by taking 21 runs off Sreesanth's first over.
With wickets falling regularly and Nazir run out for 33 by a great direct hit from Robin Uthappa, Misbah-ul-Haq had to try and lift his team with 43 off 38 balls.
Tailender Tanvir also hit two sixes right at the end, but Pathan led the Indian bowlers with 3-16, while Singh's 3-26 helped set up the middle-overs squeeze which eventually led to the victory.
"For us, the turning point was Imran Nazir's run out - we were always struggling after that," Malik said.
He also excused Shahid Afridi's dismissal for a duck, caught in the deep off his first ball.
"That's the way Shahid plays; he is the player of the tournament, so it's an honour to have him in our team," he said.
Source: http://canadianpress.google.com/article/ALeqM5hoRc_-gspv5Tq9IpQrxPHIc9gXgQ
Saturday, September 22, 2007
Testing Mistakes
The role of testing
- Thinking the testing team is responsible for assuring quality.
- Thinking that the purpose of testing is to find bugs.
- Not finding the important bugs.
- Not reporting usability problems.
- No focus on an estimate of quality (and on the quality of that estimate).
- Reporting bug data without putting it into context.
- Starting testing too late (bug detection, not bug reduction).
Planning the complete testing effort
- A testing effort biased toward functional testing.
- Underemphasizing configuration testing.
- Putting stress and load testing off to the last minute.
- Not testing the documentation.
- Not testing installation procedures.
- An overreliance on beta testing.
- Finishing one testing task before moving on to the next.
- Failing to correctly identify risky areas.
- Sticking stubbornly to the test plan.
Personnel issues
- Using testing as a transitional job for new programmers.
- Recruiting testers from the ranks of failed programmers.
- Testers are not domain experts.
- Not seeking candidates from the customer service staff or technical writing staff.
- Insisting that testers be able to program.
- A testing team that lacks diversity.
- A physical separation between developers and testers.
- Believing that programmers can't test their own code.
- Programmers are neither trained nor motivated to test.
The tester at work
- Paying more attention to running tests than to designing them.
- Unreviewed test designs.
- Being too specific about test inputs and procedures.
- Not noticing and exploring "irrelevant" oddities.
- Checking that the product does what it's supposed to do, but not that it doesn't do what it isn't supposed to do (a sketch follows this list).
- Test suites that are understandable only by their owners.
- Testing only through the user-visible interface.
- Poor bug reporting.
- Adding only regression tests when bugs are found.
- Failing to take notes for the next testing effort.
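One mistake above is worth a concrete illustration: checking only that the product does what it should. Here is a minimal sketch in Python; the parse_age function and its validation rules are hypothetical, invented just for the example.

def parse_age(text):
    value = int(text)  # raises ValueError for non-numeric input
    if not 0 <= value <= 150:
        raise ValueError("age out of range: %d" % value)
    return value

# Positive test: the product does what it is supposed to do.
assert parse_age("42") == 42

# Negative tests: the product must NOT accept what it isn't supposed to.
for bad in ["-5", "999", "forty-two"]:
    try:
        parse_age(bad)
    except ValueError:
        pass  # rejection is the expected behaviour
    else:
        raise AssertionError("parse_age accepted bad input: %r" % bad)

A suite containing only the positive test would keep passing even if parse_age silently accepted -5 or 999.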
Test automation
- Attempting to automate all tests.
- Expecting to rerun manual tests.
- Using GUI capture/replay tools to reduce test creation cost.
- Expecting regression tests to find a high proportion of new bugs.
Code coverage
- Embracing code coverage with the devotion that only simple numbers can inspire (a sketch follows this list).
- Removing tests from a regression test suite just because they don't add coverage.
- Using coverage as a performance goal for testers.
- Abandoning coverage entirely.
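The first coverage point is worth seeing in code. In this minimal sketch (the is_adult function is hypothetical), the tests execute every line, so line coverage is 100%, yet the one input that matters is never asked about.

def is_adult(age):
    # Intended rule: 18 and over are adults. Off-by-one bug: > instead of >=.
    if age > 18:
        return True
    return False

# These two checks execute every line of is_adult...
assert is_adult(30) is True
assert is_adult(10) is False
# ...but the boundary goes untested: is_adult(18) wrongly returns False.

High coverage only says the tests visited the code; it says nothing about whether they asked the right questions.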
Wednesday, September 12, 2007
Important Declaration [Please Read Before Commenting]
Most of the technical content on my blog is taken directly from various web searches or blogs. As I started blogging only recently, I didn't note down who wrote a particular piece of content while surfing the web. Some of the content, as far as I remember, is taken from James Bach's blog.
From now on I will keep track of the authors and give them appropriate credit in my blog.
Thanks for your understanding.
Tuesday, September 11, 2007
Got ticket to home
It's going to be great fun on the train. Vineet will be showcasing his talents specific to long train journeys. :)
My journey will start on 2nd Nov, and I will reach home on 5th Nov.
Software Testing Videos
Become a Software Testing Expert
Building Testing Skills
Bhatotar, Barhara Kothi, Purnia
My home is in Brahmin Tola. There are many people from here who are well educated and well placed all over India and abroad.
My village is well connected to other parts of Purnia by rail and bus. The nearest railway station is Barhara Kothi, a 5-minute walk from my home, and the nearest bus stand is also Barhara Kothi, a 10-minute walk away.
There are various temples in my village, but Kali Sthan is the most famous. Every year the Diwali celebration here is really great. Most of the NRBs [Non-Resident Biharis] come home on the occasion of Diwali.
One can reach my home from Purnia/Katihar. To reach Katihar, you can get a train/bus from Patna/Delhi/Guwahati/Kolkata/Bangalore/Mumbai, etc.
Performance Testing
Introduction
This note outlines a proposed set of standard counters to monitor during a basic performance test. These include counters specific to a web server and database server. Operating system counters cover Windows servers only. Web and database counters are for MS IIS and MS SQL Server but should be useful for other technologies.
The counters included are a minimal set designed to ensure bottlenecks have not occurred during a performance test. If bottlenecks have occurred it may be appropriate to consider re-running tests with additional counters collected.
This note covers the following:
what statistics to collect
how to set up collection of monitoring counters
It does not cover how to analyse logs containing these counter values. The references below or other documentation should be consulted as required.
References
Two very useful references were located while preparing this note.
Each has a number of sub-pages related to memory, CPU, network etc.
http://www.computerperformance.co.uk/HealthCheck/
http://www.sql-server-performance.com/tips_performance.asp
The following has guidance on what counters to collect and setup screenshots:
http://www.sql-server-performance.com/qdpma/inst_3_pmlogs.asp
What statistics to collect
What to collect – all Windows servers
Memory: Available Mbytes
Memory: Pages/sec
Memory: Page Faults/sec
Network Interface: [Relevant NIC]: Output Queue Length
Network Interface: [Relevant NIC]: Bytes Received/sec
Network Interface: [Relevant NIC]: Bytes Sent/sec
Network Segment object: %Network Utilization (if present)
Physical Disk: Avg Disk Queue Length: Total
Physical Disk: Disk Bytes/sec: Total
Physical Disk: % Disk Time: Total
Processor: % Processor Time: Total
Server: Bytes Received/sec
Server: Bytes Transmitted/sec
System: Context Switches/sec
System: Threads
System: Processes
System: Processor Queue Length
TCP: Connections Established
What to collect – web servers
Internet Information Services Global: File Cache Hits %
Internet Information Services Global: File Cache Flushes
Internet Information Services Global: File Cache Hits
Web Service: Bytes sent/sec
Web Service: Bytes received/sec
Web Service: Connection Attempts/sec
Web Service: Current Connections
Process: Working set: Inetinfo.exe (or alternative web server .exe)
What to collect – database servers
SQLServer: General Statistics: Logins/sec
SQLServer: General Statistics: User Connections
SQLServer: SQL Statistics: Batch Requests/sec
SQLServer: Access Methods: Page Splits/sec
SQLServer: Buffer Manager: Buffer Cache Hit Ratio
SQLServer: Buffer Manager: Free buffers
SQLServer: Cache Manager: Cache hit ratio
SQLServer: Locks: Average Wait Time (ms)
SQLServer: Locks: Number of Deadlocks/Sec
SQLServer: Latches: Average Latch Wait Time (ms)
SQLServer: Latches: Latch Waits/Sec
SQLServer: Memory Manager: Target Server Memory (KB)
SQLServer: Memory Manager: Total Server Memory (KB)
If you need more details on what queries are being executed and so on, a separate trace (for example, with SQL Server Profiler) can be set up, but that is beyond the scope of this note.
How to set up collection of monitoring counters
All of the counters mentioned above can be set up via the Windows Perfmon interface. For specifics of Perfmon, consult the relevant help pages, Microsoft's web site, or other readily available articles.
Perfmon provides a real-time monitor and file-logging capabilities. During a performance test, set up file logging and shut down the real-time monitor.
Where to start (local or remote)
It is possible to set up monitoring of counters remotely. This has the benefit that the logs can be started/stopped from one place, they are stored in one place, and you can use a dedicated monitoring PC which is unlikely to be stressed during the test.
This will add a little network traffic but that should not interfere with a normal performance test unless the network is already approaching utilisation limits.
The referenced documents contain more notes which may help decide whether local or remote monitoring is preferable, but they are not unanimous on the subject.
The instructions below assume remote monitoring.
What to do
The following instructions (for Windows XP) should be useful as a guide:
To set up logging of counters
Start->Control Panel->Administrative Tools->Performance
In performance logs and alerts, right click Counter logs and select New log settings
Choose a log name eg. System_Counters_Web
(log file will be named eg C:\Perflogs\System_Counters_Web_000001.blg)
General tab:
Select Add Counters.
Select Add counters from computer and specify name (or IP address) of web server.
Select the object, then the counter within the object.
Repeat for all required counters on that server then Close.
Verify that all required counters are displayed.
Select Sample counters every 15 seconds.
Log Files tab:
Set Log file type to Text file (comma delimited).
Schedule tab: leave defaults
When you have finished, click Apply, then click OK.
Logging of the counters to the specified file should start immediately.
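If you would rather script this setup than click through the GUI, the same counter log can be created with the logman command-line tool that ships with Windows XP and later. Below is a minimal sketch driving logman from Python; the server name WEBSERVER01 and the short counter list are placeholders, and the exact logman flags may vary between Windows versions.

import subprocess

# Placeholder counters; extend with the full lists from this note.
counters = [
    r"\Memory\Available MBytes",
    r"\Processor(_Total)\% Processor Time",
    r"\PhysicalDisk(_Total)\Avg. Disk Queue Length",
]

# Create the counter log: 15-second samples, comma-delimited text output.
subprocess.check_call(
    ["logman", "create", "counter", "System_Counters_Web",
     "-s", "WEBSERVER01",    # remote server to monitor (placeholder name)
     "-si", "15",            # sample interval in seconds
     "-f", "csv",            # text file (comma delimited)
     "-o", r"C:\Perflogs\System_Counters_Web",
     "-c"] + counters)

# Start the log before the test run and stop it afterwards.
subprocess.check_call(["logman", "start", "System_Counters_Web"])
# ... run the performance test ...
subprocess.check_call(["logman", "stop", "System_Counters_Web"])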
To stop logging:
Start->Control Panel->Administrative Tools->Performance
In performance logs and alerts, left click Counter logs and right click on the log eg. System_Counters_Web.
Then select Stop.
You can stop or change properties of the log.
Logs are green when running, red when stopped.
To start logging:
Start->Control Panel->Administrative Tools->Performance
In performance logs and alerts, left click Counter logs and right click on the log eg. System_Counters_Web.
Then select Start.
A file with a new suffix is created (eg. System_Counters_Web_000002).
Scheduling
It is also possible to use the Schedule tab to schedule the start and stop time of logging. If you do use Schedule, you still need to reset the start date after each log file is created (i.e. the interface does not allow Recurrence to be set up).
Software Testing Related Blogs
Are You a Testing Expert?
1. “You should write a test plan”
2. “It’s important that testing be repeatable”
3. “Each test case should have an expected result”
4. “Test automation saves money and time”
5. “All testing is based on a model of what is being tested”
6. “Good enough quality is not good enough”
7. “An undocumented test cannot be improved”
8. “Exploratory testing is a useful practice”
9. “It’s better to use the term defect than bug”
10. “Ambiguity should be removed from requirements.”
Non-Experts are More Likely to Say…
– Yes, that’s what the books say.
– This is right.
– That is wrong.
– I don’t know. {awkward silence}
Experts are More Likely to Say…
– Tell me more about the context.
– I can think of how that might be true and I can think of how it might be false. Let's think it through…
– Let me reframe that…
– Here are some possible answers…
– Here’s one way I’ve solved this…
– I don’t know. Here’s how I will find out…
What’s Special About Testing
There are few people around to teach you how to test.
Most of what is taught as “testing” is unreliable or misleading folklore.
Testing is a complex problem-solving activity.
Learning testing on your own doesn’t cost you much, you don’t need anyone’s permission, and it generally poses no threat to life or property.
However…
It’s hard to know if you are doing it well.
Good testing varies quite a lot with the context.
Experts have…
– Situational awareness
– Confidence in confusion
– Colleague network
– Trained reflexes
– Awareness of limitations
– Diverse experiences
– Relevant knowledge
– Mental models for problem-solving
– Reputation
Experts do…
– Avoid traps and dead ends
– Systematic inquiry
– Confront authority and convention
– Self-training and retraining
– Self-criticism
– Pattern matching on experience
– Coherent explanations
– Justify methodology
– Write, speak, teach
36 Important Words for a Software Tester
- Customers
- Information
- Developer relations
- Team
- Equipment & tools
- Schedule
- Test Items
- Deliverables
- Structures
- Functions
- Data
- Platforms
- Operations
- Time
- Capability
- Reliability
- Usability
- Security
- Scalability
- Performance
- Installability
- Compatibility
- Supportability
- Testability
- Maintainability
- Portability
- Localizability
- Function testing
- Domain testing
- Stress testing
- Flow testing
- Scenario testing
- Claims testing
- User testing
- Risk testing
- Automatic testing
Top 24 replies by programmers when their programs don't work
23. "Who did you login as ?"
22. "It's a feature"
21. "It's WAD (Working As Designed)"
20. "That's weird..."
19. "It's never done that before."
18. "It worked yesterday."
17. "How is that possible?"
16. "It must be a hardware problem."
15. "What did you type in wrong to get it to crash?"
14. "There is something funky in your data."
13. "I haven't touched that module in weeks!"
12. "You must have the wrong version."
11. "It's just some unlucky coincidence."
10. "I can't test everything!"
9. "THIS can't be the source of THAT."
8. "It works, but it's not been tested."
7. "Somebody must have changed my code."
6. "Did you check for a virus on your system?"
5. "Even though it doesn't work, how does it feel?"
4. "You can't use that version on your system."
3. "Why do you want to do it that way?"
2. "Where were you when the program blew up?"
1. "I thought I fixed that."
Software Engineering Proverbs
A clever person solves a problem.
A wise person avoids it.
-- Einstein
André Bensoussan once explained to me the difference between a programmer and a designer:
"If you make a general statement, a programmer says, 'Yes, but...'
while a designer says, 'Yes, and...'"
No matter what the problem is,
it's always a people problem.
Wexelblat's Scheduling Algorithm:
Choose two:
- Good
- Fast
- Cheap
Craziness is doing the same thing and expecting a different result.
-- Tom DeMarco, rephrasing Einstein, who said:
Insanity: doing the same thing over and over again and expecting different results.
"There's no time to stop for gas, we're already late"
-- Karin Donker
Deming's 14 points
- Create constancy of purpose.
- Adopt the new philosophy.
- Cease dependence on mass inspection to achieve quality.
- Minimize total cost, not initial price of supplies.
- Improve constantly the system of production and service.
- Institute training on the job.
- Institute leadership.
- Drive out fear.
- Break down barriers between departments.
- Eliminate slogans, exhortations, and numerical targets.
- Eliminate work standards (quotas) and management by objective.
- Remove barriers that rob workers, engineers, and managers of their right to pride of workmanship.
- Institute a vigorous program of education and self-improvement.
- Put everyone in the company to work to accomplish the transformation.
We know about as much about software quality problems as they knew about the Black Plague in the 1600s. We've seen the victims' agonies and helped burn the corpses. We don't know what causes it; we don't really know if there is only one disease. We just suffer -- and keep pouring our sewage into our water supply.
-- Tom Van Vleck
The Troops Know
- The schedule doesn't have enough time for maintenance in it.
- A lot of bugs get past the tests.
- Most old code can't be maintained.
To go faster, slow down. Everybody who knows about orbital mechanics understands that.
-- Scott Cherf
Everybody Knows:
- Discipline is the best tool.
- Design first, then code.
- Don't patch bugs out, rewrite them out.
- Don't test bugs out, design them out.
Everybody Knows:
- If you don't understand it, you can't program it.
- If you didn't measure it, you didn't do it.
Everybody Knows:
If something is worth doing once, it's worth building a tool to do it.
Your problem is another's solution;
Your solution will be his problem.
Everybody Knows:
- If you've found 3 bugs in a program, best estimate is that there are 3 more.
- 60% of product cost comes after initial shipment.
The significant problems we face cannot be solved by the same level of thinking that created them.
-- Albert Einstein
On the radio the other night, Jimmy Connors said the best advice he ever got was from Bobby Riggs:
- do it
- do it right
- do it right now
It is not enough to do your best: you must know what to do, and THEN do your best.
-- W. Edwards Deming
A leader is best when people barely know that he exists.
Less good when they obey and acclaim him.
Worse when they fear and despise him.
Fail to honor people, and they fail to honor you.
But of a good leader, when his work is done, his aim fulfilled,
they will say, "We did this ourselves."
-- Lao-Tzu
You must be the change
You wish to see in the world
-- Gandhi
Experiment escorts us last,
His pungent company
Will not allow an axiom
An opportunity.
-- Emily Dickinson
when the cart stops
do you whip the cart
or whip the ox?
Q: How many QA testers does it take to change a lightbulb?
A: QA testers don't change anything. They just report that it's dark.
Q: How many software engineers does it take to change a lightbulb?
A: Just one. But the house falls down.
Andrew Siwko
One test is worth a thousand opinions.
"If you didn't write it down, it didn't happen."
This saying is popular among scientists (doing experiments), but I believe it applies to software testing, particularly for real-time systems.
We reject kings, presidents, and voting.
We believe in rough consensus and running code.
--Dave Clark (1992)
I am a design chauvinist. I believe that good design is magical and not to be lightly tinkered with. The difference between a great design and a lousy one is in the meshing of the thousand details that either fit or don't, and the spirit of the passionate intellect that has tied them together, or tried. That's why programming---or buying software---on the basis of "lists of features" is a doomed and misguided effort. The features can be thrown together, as in a garbage can, or carefully laid together and interwoven in elegant unification, as in APL, or the Forth language, or the game of chess.
-- Ted Nelson
Software is Too Important to be Left to Programmers, by Meilir Page-Jones.
"If you think good architecture is expensive, try bad architecture."
-- Brian Foote and Joseph Yoder
Abraham Lincoln reportedly said that, given eight hours to chop down a tree, he'd spend six sharpening his axe.
-- TidBITS 654, quoted by Derek K. Miller, via Art Evans
... while we all know that unmastered complexity is at the root of the misery, we do not know what degree of simplicity can be obtained, nor to what extent the intrinsic complexity of the whole design has to show up in the interfaces. We simply do not know yet the limits of disentanglement. We do not know yet whether intrinsic intricacy can be distinguished from accidental intricacy.
-- E. W. Dijkstra, Communications of the ACM, Mar 2001, Vol. 44, No. 3
You can only find truth with logic if you have already found truth without it.
-- Gilbert Keith Chesterton (1874-1936) " The Man who was Orthodox", via Paul Black
Software Testing Types
* CONFORMANCE TESTING. Verifying implementation conformance to industry standards. Producing tests for the behavior of an implementation to be sure it provides the portability, interoperability, and/or compatibility a standard defines.
* FUNCTIONAL TESTING. Validating that an application or Web site conforms to its specifications and correctly performs all its required functions. This entails a series of tests which perform a feature-by-feature validation of behavior, using a wide range of normal and erroneous input data. This can involve testing of the product's user interface, APIs, database management, security, installation, networking, etc. Functional testing can be performed on an automated or manual basis using black box or white box methodologies.
* LOAD TESTING. Load testing is a generic term covering Performance Testing and Stress Testing.
* PERFORMANCE TESTING. Performance testing can be applied to understand your application or Web site's scalability, or to benchmark the performance of third-party products such as servers and middleware in your environment ahead of a potential purchase. This sort of testing is particularly useful for identifying performance bottlenecks in high-use applications. Performance testing generally involves an automated test suite, as this allows easy simulation of a variety of normal, peak, and exceptional load conditions.
* REGRESSION TESTING. Similar in scope to a functional test, a regression test allows a consistent, repeatable validation of each new release of a product or Web site. Such testing ensures reported product defects have been corrected for each new release and that no new quality problems were introduced in the maintenance process. Though regression testing can be performed manually an automated test suite is often used to reduce the time and resources needed to perform the required testing.
* SMOKE TESTING. A quick-and-dirty test that the major functions of a piece of software work without bothering with finer details. Originated in the hardware testing practice of turning on a new piece of hardware for the first time and considering it a success if it does not catch on fire.
* STRESS TESTING. Testing conducted to evaluate a system or component at or beyond the limits of its specified requirements to determine the load under which it fails and how. A graceful degradation under load leading to non-catastrophic failure is the desired result. Often Stress Testing is performed using the same process as Performance Testing but employing a very high level of simulated load.
* UNIT TESTING. Functional and reliability testing in an Engineering environment. Producing tests for the behavior of components of a product to ensure their correct behavior prior to system integration.
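To make the last definition concrete, here is a minimal unit-testing sketch using Python's built-in unittest module. The discounted_price function is a hypothetical component under test, invented for the example.

import unittest

def discounted_price(price, percent):
    # Apply a percentage discount; reject out-of-range percentages.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (100 - percent) / 100.0

class DiscountTest(unittest.TestCase):
    def test_typical_discount(self):
        self.assertEqual(discounted_price(200, 25), 150.0)

    def test_no_discount(self):
        self.assertEqual(discounted_price(99, 0), 99.0)

    def test_invalid_percent_rejected(self):
        self.assertRaises(ValueError, discounted_price, 100, 150)

if __name__ == "__main__":
    unittest.main()

Each test exercises one behaviour of the component in isolation, before any system integration, which is exactly what the definition above describes.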
Some Important Sites
www.myspace.com
www.orkut.com
www.facebook.com
www.spaces.live.com
www.batchmates.com
www.yaari.com
www.minglebox.com
www.ning.com
www.meetup.com
www.bebo.com
www.fropper.com
www.bigadda.com
Media sharing
www.youtube.com
www.flickr.com
www.panoramio.com
www.fotothing.com
www.yuntaa.com
www.dailymotion.com
www.blogtv.com
www.cylive.com
www.twango.com
www.ishare.rediff.com
www.zooomr.com
www.wetpaint.com
www.india.takingglobal.com
www.shelfari.com
www.ezmo.com
www.last.fm
www.broadcaster.com
www.cardomain.com
www.care2.com
www.couchsurfing.com
www.geni.com
www.ittoolbox.com
www.dzone.com
www.motortopia.com
www.oktatabyebye.com
www.dogster.com
www.flixster.com
www.mypunchbowl.com
Business Networking
www.bizlink.com
www.linkedin.com
www.ryze.com
www.peoplethatclick.com
www.pairup.com
www.socialpicks.com
www.wxing.com
Educational
www.in.answers.yahoo.com
www.3form.com
www.iq.lycos.co.uk
www.answerbag.com
www.whyville.com
www.groups.google.com
www.labs.google.com
www.experts-exchange.com
www.donationcoder.com
www.uclue.com
Bookmarking
www.del.icio.us
www.stumbleupon.com
www.digg.com
www.diigo.com
www.ulinkx.com
www.citeulike.org
www.blogg-buzz.com
www.librarything.com
www.clipmark.com
Going mobile
www.tezaa.com
www.in.wedja.com
www.frenzo.in
www.in.wap.yahoo.com
Social voice chat
www.sociallight.com
www.monospace.com/wap2
www.mob5.com
www.m.twitter.com
www.mobile.mobleo.net
Weird networking sites
www.smallworld.columbia.edu
www.matchadream.com
www.imagini.net
www.kiva.org
www.stardoll.com
www.43people.com
www.gopets.net
www.gothpassions.com
Random/Technical Sites
http://googletesting.blogspot.com/
http://360tek.blogspot.com/2007/04/what-is-directory-ldap-ad-and-adam.html
http://blogs.oracle.com/talkingidentity/
http://blogs.sun.com/superpat/category/Identity
Some technical sites
http://www.aptest.com/resources.html
http://www.softwaretestingwiki.com/doku.php
http://identityaccessman.blogspot.com
http://prashant-iam.blogspot.com
http://brandandmarket.blogspot.com
http://www.brandingstrategyinsider.com
http://www.dishant.com/album
Types of testing that we'd like not to see :)
AGGRESSION TESTING: If this doesn't work, I'm gonna kill somebody.
COMPRESSION TESTING: []
CONFESSION TESTING: Okay, okay, I did program that bug.
CONGRESSIONAL TESTING: Are you now, or have you ever been a bug?
DEPRESSION TESTING: If this doesn't work, I'm gonna kill myself.
EGRESSION TESTING: Uh-oh, a bug... I'm outta here.
DIGRESSION TESTING: Well, it should work, but let me tell you about my truck...
EXPRESSION TESTING: #@%^&*!!!, a bug!
OBSESSION TESTING: I'll find this bug if it is the last thing that I do.
OPPRESSION TESTING: You will test this, now!
POISSION TESTING: Alors! Regardez le poission!
REPRESSION TESTING: It's not a bug, it's a feature.
SUCCESSION TESTING: The system is dead. Long live the new system!
SUGGESTION TESTING: Well, it works, but wouldn't it be better if...
PRESIDENTIAL TESTING: Using the definition of testing as defined in the affidavit...
Software Testing: Some Definitions
James's license plate got me thinking. What can we say about our work that would fit comfortably on the fender of a car? Here are my suggestions for bumper stickers that just might rock the industry.
We could start by hijacking existing bumper sticker mottos:
But those seem too lame and tame. How about emphasizing the unique mental attitudes of testers?
Or, we could emphasize the often-unnoted contributions testers make:
We could even support both sides of the "making and breaking" question:
Not bad for a start, but perhaps we'd like to get in a few digs at development while we are at it:
But maybe that is too hard on our poor developers, and we are all in this together. What I'd like to see is developers' cars sporting the following:
Or, we could even support positions within our own testing community. I work with test automation and spec-based test generation most of the time, so how about these:
There can be lofty sentiments for those idealists among us:
And some not-so-lofty sentiments for those whose ideals have taken a beating:
Truthfully, though, I am a tester because that is what I have always been, even when I was a kid. I have always asked awkward questions that I felt needed to be asked. I always looked for answers I could be satisfied with. So, the bumper sticker that sums it up for me would be: