From PostgreSQL wiki
Revision as of 02:57, 12 December 2013
HOWTO Alpha and Beta Test
If you are not a code contributor (or even if you are), testing PostgreSQL Alphas and Betas is one of the best things you can do for our project. By participating in organized testing, you help get the release out faster, with more features and fewer bugs.
== Prerequisites to Testing ==
A. Have a Test Machine: somewhere you can run tests against a PostgreSQL database and/or application, even if those tests consume all of the machine's resources or crash it. If your company uses PostgreSQL and you have a testbed for development changes, that is ideal.
B. Have a Test: an application test, interface test, performance test, and/or database feature test you can run. See below for explanations and suggestions.
C. Have Some Time: for testing when new Alphas or Betas come out. It doesn't have to be a lot of time, but expect to spend a couple of hours testing, and following up on questions from PostgreSQL hackers, with each release.
== Types of Tests You Can Run ==

=== Installation Tests ===
This is the simplest test, and it precedes all of the others: install the new version of PostgreSQL, configure it with your favorite options, contrib modules, and external tools, and check that everything works.
=== Custom Application Tests ===
If you're already using PostgreSQL in production with a custom application, try porting the application to the new release and running your own application and performance tests. Also try upgrading a copy of a populated database and check for upgrade errors. Tell us what you find.
This is more valuable, of course, if your application has a full testing framework which allows you to unit test your application functions and its overall performance. But even doing some ad-hoc user testing of the upgrade is helpful.
=== General Application Tests ===
Some popular open source and proprietary applications come with their own test suites, unit tests, and/or performance tests. At some point, we will have a list of these; if you know one, please add it. For each of these, you can test the following things:
* Upgrade a populated database
* Create a new database using the software's installer
* Run unit tests and look for errors
* Run performance tests and look for performance regressions
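One low-effort way to make these checks repeatable is to script them. A minimal Python sketch; the command lists here are placeholders, so substitute your application's actual installer, test-suite, and benchmark commands:

```python
import subprocess

def run_step(name, cmd):
    """Run one checklist step; return (name, passed, output)."""
    proc = subprocess.run(cmd, capture_output=True, text=True)
    return (name, proc.returncode == 0, proc.stdout + proc.stderr)

def run_checklist(steps):
    """Run every (name, cmd) pair and print a pass/fail summary."""
    results = [run_step(name, cmd) for name, cmd in steps]
    for name, passed, _ in results:
        print(f"{'PASS' if passed else 'FAIL'}: {name}")
    return results

# Placeholder commands: replace with your application's real ones.
steps = [
    ("create new database", ["echo", "installer would run here"]),
    ("run unit tests", ["echo", "test suite would run here"]),
]
```

Keeping the step list in one script also gives you something concrete to paste into your report's Test Procedure field.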
Open Source applications with test frameworks:
* Rails Apps:
** SQLAlchemy
* PHP Apps:
* Perl Apps:

=== Interface Tests ===
Nobody can use PostgreSQL without a driver, and we want to make sure that all of our drivers work as well as possible with each new version of PostgreSQL. We particularly want to make sure that new features don't break these drivers. Fortunately, many drivers have regression tests.
So download the latest version of your favorite driver (and perhaps some others you don't like as much, but which are still current and popular), point it at the new release of PostgreSQL, and run its regression tests. Then build a simple database that uses some of the new release's features and try to query it: do the new syntax or new data types break the driver?
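A driver-agnostic way to script such probes is to feed candidate statements through a DB-API cursor and record which ones the driver chokes on. In this sketch the probe statements are only illustrative examples, not an official list of new features; with psycopg2, for instance, you would pass in `conn.cursor()`:

```python
def probe_features(cursor, probes):
    """Run each named SQL probe through the driver and record
    whether the driver handled it without raising."""
    results = {}
    for name, sql in probes.items():
        try:
            cursor.execute(sql)
            cursor.fetchall()
            results[name] = "ok"
        except Exception as exc:  # driver-level failures surface here
            results[name] = f"FAILED: {exc}"
    return results

# Illustrative probes only; swap in the new release's actual syntax and types.
PROBES = {
    "range type": "SELECT '[1,10)'::int4range",
    "json literal": "SELECT '{\"a\": 1}'::json",
}
```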
=== Performance Tests ===
Another critical thing to check for is performance regressions and improvements. The difficulty with performance testing is that you must conduct your tests in a manner that produces repeatable results, or the performance metrics aren't worthwhile. This means running them on a system that has nothing else running besides the test, running each test several times, and checking that the results are consistent. You also need to run the test or benchmark against both the previous version (e.g. 8.4.4) and the current alpha or beta.
Performance tests you can run include:
* An application performance benchmark, if your custom or public application has one.
* pgbench. Instructions on getting useful results from pgbench will be added later.
* Real benchmarks like DBT2, SPEC's EAStress, etc. These require a large investment of time and server hardware to run.
* Specific task performance: try running an operation which you know strains PostgreSQL's resources (such as a 40-table JOIN or an ELT operation), or one which is supposed to be improved in the new release.
If you start running performance tests, it's very important that you stick with it for the whole alpha/beta cycle so that we can see if we're improving and/or fixing any problems.
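Before comparing versions, it helps to quantify how repeatable your runs actually are: a large spread across runs means the comparison is noise. A small Python sketch, where the TPS figures are invented for illustration:

```python
from statistics import mean, stdev

def summarize(runs):
    """Mean and coefficient of variation for repeated benchmark runs.
    A high CV means the results aren't repeatable enough to report."""
    m = mean(runs)
    cv = stdev(runs) / m if len(runs) > 1 else 0.0
    return m, cv

def regression(old_runs, new_runs):
    """Percent change from the previous release to the alpha/beta.
    For a throughput metric like TPS, negative means the new release
    is slower."""
    old_m, new_m = mean(old_runs), mean(new_runs)
    return (new_m - old_m) / old_m * 100

# Hypothetical pgbench TPS numbers from five runs on each version.
old = [1480, 1502, 1495, 1510, 1488]
new = [1445, 1430, 1452, 1441, 1439]
```

Reporting both the mean and the spread lets the hackers judge whether an apparent regression is real.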
=== New Feature Tests ===
Every alpha or beta release has new features. While our contributors do a good job of testing each feature in isolation, testing each new feature in combination with every other new feature, as well as with old features, is combinatorially impossible for the hackers. Also, we're so close to the code that we can overlook major omissions in the docs. This is where you come in. Things to test:
* Pick any new user-facing feature you've never heard of before. Try to use it based entirely on the docs. Report results, problems, and suggested doc changes.
* Pick any two or three new features which could be combined somehow (e.g. join removal and partition joins, or DefaultACLs and column triggers). Try to run a script or query which combines these features. Look for errors, non-spec behavior, or confusing behavior. Report both the testing procedure and the results.
* Pick any new feature and an old feature which is either complex (e.g. Tsearch2) or was added in the last few versions (e.g. windowing functions). Combine them, and report how you did it and the results.
Again, it's important here that you follow up on fixes for any issues you report.
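If you want to pick combinations systematically rather than at random, you can enumerate the candidate pairs; this also shows why the hackers can't cover the space themselves. A sketch using the example feature names from the text above:

```python
from itertools import combinations

# Example feature names from the text; pull the real ones
# from the release notes of the alpha or beta you are testing.
NEW_FEATURES = ["join removal", "default ACLs", "column triggers"]
OLD_FEATURES = ["Tsearch2", "windowing functions"]

def combination_tests(new, old, size=2):
    """Enumerate pairwise combinations worth testing:
    new+new pairs, plus each new feature against each old one."""
    pairs = list(combinations(new, size))
    pairs += [(n, o) for n in new for o in old]
    return pairs
```

With dozens of new features per release, the pair count grows quadratically, which is exactly why distributed testing by volunteers matters.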
== How to Test ==
# The first time you're going to run a particular test, you might want to post it to pgsql-testers so that other testers can suggest tweaks to make the test more useful.
# Script the test so that you can reproduce it.
# Run the test and collect the results in a log.
## Run the test on an older version for comparison, if applicable.
# Copy the e-mail template below and post the results, using the template, to pgsql-testers.
# Look for follow-up on your test and respond to it.
# Wait for the next release and run the test again.
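Steps 2, 3, and 6 above amount to: wrap the test in a script, log each run tagged with the release it ran against, and rerun on the next release. A minimal Python sketch; the test command shown is a placeholder for your real test script:

```python
import datetime
import subprocess

def run_and_log(release, test_cmd, log_path):
    """Run a scripted test and append a timestamped, release-tagged
    record to a log, so results from successive alphas and betas
    can be compared later."""
    proc = subprocess.run(test_cmd, capture_output=True, text=True)
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    with open(log_path, "a") as log:
        log.write(f"{stamp} release={release} rc={proc.returncode}\n")
        log.write(proc.stdout)
    return proc.returncode
```

The accumulated log is also a convenient source for the Results field of your report.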
You can report tests by email. You can subscribe to any PostgreSQL mailing list from the subscription form.
* pgsql-bugs: this is the preferred mailing list if you think you have found a bug in the beta. You can also use the Bug Reporting Form.
* pgsql-hackers: bugs, questions, and successful test reports are welcome here if you are already subscribed to pgsql-hackers. Note that pgsql-hackers is a high-traffic mailing list with a lot of development discussion.
== What to report ==
Please try to include all of the following information in your e-mail:

 Name:
 Release:
 Test Type:
 Test Detail:
 Platform:
 Installation Method:
 Platform Detail:
 Test Procedure:
 Failure:
 Results:
 Comments:

What goes in the fields:
; Name : Your name, so that we can track who is doing which tests; if you use a nickname, please be consistent. For that matter, if you do a lot of beta testing you may be credited in the release notes and/or on the contributors page, so you may want to give us your real name.
; Release : The specific release you tested. Required.
; Test Type : What kind of test was it? Mark multiple options if it was a combination test.
; Test Detail : A short description of the test you ran, e.g. "SQLAlchemy unit test suite", "pgbench", or "combined sync replication and unlogged tables". Required.
; Platform : Your operating system.
; Installation Method : How did you install the alpha or beta release?
; Platform Detail : A brief description of the platform you ran the test on, e.g. "MacBook, Snow Leopard, 64-bit" or "Linux, 16-core Sun 4600 with NAS storage". Optional.
; Test Procedure : Steps to reproduce your test, if they're not immediately obvious. Optional.
; Failure : Did the test show a compatibility issue, an error, a performance regression, or anything else the PostgreSQL hackers should take a look at? Required.
; Results : A narrative of how you ran the test and what happened. If you experienced errors, please paste the detailed error messages.
; Comments : Anything else relevant which wasn't included above. Optional.
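If you report regularly, it may be worth generating the e-mail body from a script so the required fields can't be forgotten. A sketch using the field names from the list above (the text marks Release, Test Detail, and Failure as required):

```python
# Field names taken from the descriptions above.
FIELDS = ["Name", "Release", "Test Type", "Test Detail", "Platform",
          "Installation Method", "Platform Detail", "Test Procedure",
          "Failure", "Results", "Comments"]
REQUIRED = {"Release", "Test Detail", "Failure"}

def format_report(values):
    """Render the e-mail body, refusing to omit a required field."""
    missing = [f for f in REQUIRED if not values.get(f)]
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    return "\n".join(f"{f}: {values.get(f, '')}" for f in FIELDS)
```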