Well, day two here at PgEast has drawn to a close, and it was another
very informative day.
Today I concentrated on the more common tasks of a Pg DBA, so I attended three
talks (four if you count mine) that were rather heavy on the technical side of the job.
Kevin Kempter drew me back again with his excellent talk on backup and recovery methods,
this time giving some very good advice on how to use and abuse pg_dumpall and
pg_restore. He also touched on three different recipes for PITR on PostgreSQL and gave some handy
advice on when and why to use it.
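None of what follows is from Kevin's slides; it is just a minimal sketch of the usual division of labour between the two tools, with made-up database names. pg_dumpall captures the cluster-wide objects (roles, tablespaces) as plain SQL, while pg_restore only understands the custom-format archives that pg_dump -Fc produces:

```python
#!/usr/bin/env python3
"""Nightly backup sketch -- my own minimal take, not Kevin's recipe."""
import subprocess
from datetime import date

STAMP = date.today().isoformat()
DATABASES = ["sales", "inventory"]  # hypothetical database names

# Roles and tablespaces live outside any single database, so they come
# from pg_dumpall as a plain SQL script (replayed later with psql).
subprocess.run(
    ["pg_dumpall", "--globals-only", "-f", f"globals-{STAMP}.sql"],
    check=True)

# Each database gets a custom-format (-Fc) dump, the format pg_restore
# can restore from selectively (and in parallel with -j).
for db in DATABASES:
    subprocess.run(
        ["pg_dump", "-Fc", "-f", f"{db}-{STAMP}.dump", db],
        check=True)

# Restoring one database later:
#   pg_restore -d sales sales-YYYY-MM-DD.dump
```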
I also caught another Mongo talk, this time by Steve Francia, on the application of Mongo
in a real-world web retail store. He presented a very convincing argument for the NoSQL side of things in
the retail realm: namely, that an RDBMS works great when you have but a few similar products,
such as books, CDs, and movies, but what if you are a retailer who sells jeans, watches, and fresh fruit as well?
Mongo allows for a completely flexible schema, and this fits the diverse retail model well.
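To make that concrete, here is a tiny sketch of the idea in pymongo; the collection and field names are my own inventions, not from Steve's talk:

```python
from pymongo import MongoClient  # pip install pymongo

# Hypothetical local server, database, and collection names.
products = MongoClient()["store"]["products"]

# No ALTER TABLE, no wide table full of NULLs: each document carries
# only the attributes that make sense for that kind of product.
products.insert_many([
    {"name": "Classic Jeans", "waist": 32, "inseam": 30, "wash": "dark"},
    {"name": "Dive Watch", "water_resistance_m": 200, "band": "steel"},
    {"name": "Bananas", "sold_by": "lb", "organic": True, "shelf_life_days": 5},
])
```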
Another good point he made concerned the archiving of transactions. Taking a product return
as an example, one has to keep a record of all the details of the sale, and one needs some mechanism to reconcile data points such as the price, which might be stale by the time of the return. In
the Mongo world one just keeps the original sales record. A rather elegant solution.
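Again, a hypothetical sketch of that idea (the field names are mine): rather than reconciling the return against price data that may have changed since, the return document simply embeds the sale as it was:

```python
from datetime import datetime, timezone
from pymongo import MongoClient

db = MongoClient()["store"]  # hypothetical database name

# Fetch the original transaction exactly as it was recorded.
sale = db.sales.find_one({"order_id": 1042})

# The return embeds a full snapshot of the sale, so the price paid
# never has to be reconciled against today's (possibly stale) data.
db.returns.insert_one({
    "order_id": sale["order_id"],
    "returned_at": datetime.now(timezone.utc),
    "reason": "wrong size",
    "original_sale": sale,  # frozen copy from the time of purchase
})
```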
I rounded out the day with two technical Pg talks. In the first, Magnus Hagander gave a very informative talk on the differing approaches to the 'caching problem' of web applications with PostgreSQL. By leveraging PostgreSQL's notification system (LISTEN/NOTIFY), one can easily build a very robust and scalable cache with another open source product called Varnish.
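I won't try to reproduce Magnus's recipe here, but the basic building block looks something like this psycopg2 sketch (channel name, port, and URLs are my assumptions): a small worker LISTENs on a channel and translates each notification into an HTTP PURGE against Varnish, which can be configured in its VCL to honour PURGE requests.

```python
import select
from http.client import HTTPConnection

import psycopg2
import psycopg2.extensions

# Hypothetical connection string and channel name -- not from the talk.
conn = psycopg2.connect("dbname=store")
conn.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)
cur = conn.cursor()
cur.execute("LISTEN cache_purge;")
# On the database side, a trigger (or the app) fires something like:
#   NOTIFY cache_purge, '/product/42';

while True:
    # Block until PostgreSQL pushes a notification down the socket.
    if select.select([conn], [], [], 60) == ([], [], []):
        continue  # timed out; loop and wait again
    conn.poll()
    while conn.notifies:
        note = conn.notifies.pop(0)
        # Tell Varnish (assumed on localhost:6081) to drop the stale URL.
        http = HTTPConnection("localhost", 6081)
        http.request("PURGE", note.payload)
        http.getresponse()
        http.close()
```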
The second was a talk on migrating from MySQL to PostgreSQL given by Paul Gross. This was an interesting case study of his experience where he was constrained by a zero-downtime requirement. His solution was to use the ORM he was familiar with, ActiveRecord, and use that to solve the many data-conversion problems he had encountered using just Ruby on its own. He used an iterative approach where he ran a script over and over, at each pass gathering up any changes from the originating MySQL database into PostgreSQL, until the two were exactly the same. This was successful, with only a 30-second blackout during the switchover to the new DB.
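Paul worked in Ruby with ActiveRecord; the following Python sketch is only my guess at the shape of that loop (the table, columns, and updated_at watermark are all my assumptions):

```python
"""Iterative sync: replay rows changed in MySQL since the last pass into
PostgreSQL until the delta drains to nothing, then cut over."""
import time

import MySQLdb   # pip install mysqlclient
import psycopg2

mysql = MySQLdb.connect(db="shop")    # hypothetical source database
pg = psycopg2.connect("dbname=shop")  # hypothetical target database
watermark = "1970-01-01 00:00:00"

while True:
    src = mysql.cursor()
    src.execute(
        "SELECT id, name, price, updated_at FROM products"
        " WHERE updated_at > %s ORDER BY updated_at", (watermark,))
    rows = src.fetchall()
    if not rows:
        # Nothing changed since the last pass. In the real cutover you
        # would freeze writes to MySQL just before this point -- hence
        # the brief (~30 second) blackout -- then flip the app over.
        break
    with pg, pg.cursor() as dst:
        for pid, name, price, updated_at in rows:
            # Delete-then-insert works as a crude upsert on any version.
            dst.execute("DELETE FROM products WHERE id = %s", (pid,))
            dst.execute(
                "INSERT INTO products (id, name, price)"
                " VALUES (%s, %s, %s)", (pid, name, price))
            watermark = str(updated_at)
    time.sleep(1)  # each pass gets cheaper as the delta shrinks
```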
Well, that is about it for the day.
3 Comments
Hint for the MongoDB folks:
Retailers that sell everything are not new and have been around forever.
Walmart, Sears, Macy’s, Amazon, eBay – all those are very happy with Oracle.
The relational model is much more flexible than anyone would expect.
Of course the MongoDB folks wouldn't know it; they've never worked with a database that isn't MySQL and have very little understanding of the relational model and the problems it is intended to solve.
Well, you know, Gwen, even an old RDBMS hack like me has to admit the Mongo solution makes child's play out of all the hoops and barrels one has to jump through and over to get an RDBMS to work in this way.
In the end, once the 'Grand Fromage' at your Walmarts and Macy's figure out that something like MongoDB can do what they want on time and on the cheap, you will see a shift.
John,
I’m glad you enjoyed my presentation.
I’ve put the slides here for anyone who would like to see the presentation.
https://spf13.com/post/augmenting-rdbms-with-nosql-for-e-commerce
Gwen,
I've used many relational databases, including Oracle, DB2, MSSQL, PostgreSQL, Sybase, MySQL, and more. In the presentation I even address how Oracle provides a solution to this problem. I think the presentation provides a compelling case for using MongoDB to overcome the limitations of an RDBMS. Not everyone can afford to use Oracle, and I think MongoDB is a more elegant solution to this problem.
Once you've read the presentation, I'd love some constructive feedback.