PostgreSQL testing brainstorming: PGCon2013 unconference notes

From PostgreSQL wiki

Revision as of 18:32, 25 May 2013 by Dbs

Build farm - composed of "animals"

  • Multiple instances are currently run per animal; each animal configures which contrib modules it builds (usually based on platform restrictions), but all tests are run
  • No support for machine-to-machine testing at the moment (you can, however, run replication across two local instances)
  • Peter Eisentraut had been doing a lot of work with Jenkins

Question about collecting data on where bugs are being found and fixed, to build a heat map of problem areas; discussion veers into the project's deliberate lack of a bug tracker; possibility of using tagging in the bug emails to enable some level of analytics

Testing of clients (beyond psql): libpq is tested, and that is what the Python, PHP, Perl, etc. drivers generally build on, but there is likely not significant automated testing happening with those clients; in any case, that is generally considered out of scope for the PostgreSQL project proper

Automated fuzz testing of generated SQL, etc. is apparently ongoing, but the project only hears about it when someone finds a problem
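None of the external fuzzing tools were named in the discussion, so purely as a hedged illustration of the idea, a minimal grammar-driven generator of random (syntactically plausible) SQL might look like the following sketch; the grammar, table names, and `generate` helper are all hypothetical:

```python
import random

# Hypothetical toy grammar: each symbol maps to candidate templates whose
# {placeholders} are recursively expanded into further random productions.
GRAMMAR = {
    "query": ["SELECT {cols} FROM {table} {where}"],
    "cols": ["*", "a", "a, b"],
    "table": ["t1", "t2", "(SELECT a, b FROM t1) AS sub"],
    "where": ["", "WHERE {cond}", "WHERE {cond} AND {cond}"],
    "cond": ["a = {num}", "b < {num}", "a IN (SELECT a FROM t2)"],
    "num": ["0", "1", "42", "NULL"],
}

def generate(symbol="query", rng=random):
    """Pick a random production for `symbol` and expand its placeholders."""
    template = rng.choice(GRAMMAR[symbol])
    while "{" in template:
        start = template.index("{")
        end = template.index("}", start)
        inner = template[start + 1:end]
        template = template[:start] + generate(inner, rng) + template[end + 1:]
    return template.strip()

if __name__ == "__main__":
    rng = random.Random(0)  # seeded so runs are reproducible
    for _ in range(3):
        print(generate(rng=rng))
```

A real fuzzer would feed each generated statement to a server and watch for crashes or assertion failures rather than printing it.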

Performance farm

  • Would like to have overall performance testing (with throughput / stress testing); Mark Wong is already doing this on a biweekly basis with DBT2 (biweekly because setup / teardown is hard)
  • Would like to have individual SQL query regression testing to prevent future fiascos like the 9.2 subquery performance regression
  • The only thing standing in the way is _time_ to write the performance-testing infrastructure; hardware _is_ available (and can be dedicated to performance purposes to provide consistency)
  • The build farm already has the ability to report basic profiling information; tools like iozone would be useful to help filter out hardware weirdness
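The per-query regression testing mentioned above boils down to comparing each query's runtime against a stored baseline and flagging outliers. As a hedged sketch (not any existing performance-farm code; `find_regressions`, the threshold, and the sample numbers are all invented for illustration):

```python
# Hypothetical per-query regression check: compare each query's median
# runtime against a stored baseline and flag anything that slowed down
# by more than a tolerance factor.
from statistics import median

def find_regressions(baseline, current, threshold=1.2):
    """Return queries whose median runtime grew by more than `threshold`x.

    `baseline` and `current` map query name -> list of runtimes (seconds).
    """
    regressions = {}
    for name, runs in current.items():
        if name not in baseline:
            continue  # new query, nothing to compare against
        old, new = median(baseline[name]), median(runs)
        if old > 0 and new / old > threshold:
            regressions[name] = (old, new)
    return regressions

# Made-up numbers: q2 runs ~3x slower than its baseline, loosely
# analogous to the 9.2 subquery regression mentioned above.
baseline = {"q1": [0.10, 0.11, 0.10], "q2": [0.50, 0.52, 0.49]}
current = {"q1": [0.10, 0.12, 0.11], "q2": [1.60, 1.55, 1.58]}
print(find_regressions(baseline, current))  # flags only q2
```

Using the median rather than the mean makes the check less sensitive to a single noisy run, which matters on shared hardware; dedicated machines (as proposed above) would tighten the usable threshold.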