I thought that, after our last meeting, I might have a go at 'whacking in' the data from the Jefferies Database book that you gave me and playing around with it.
But with stock coverage running to a massive 62 pages, I had second thoughts!
That, of course, is one of the potential strengths of the Jefferies Database: in terms of breadth and depth of stock coverage there is a lot to work with here, with many US sectors being very well covered.
The first task would be to see if some sort of electronic version exists. I imagine there must be at least one spreadsheet somewhere, regularly updated in order to publish this document, possibly with more data than is published here. Getting hold of that on a regular basis would be a great start: it would enable you to take the basic data, update earnings regularly by pasting new estimates over the old ones, and build it out by adding new lines/pages of data from other sources/calculations.
If breadth of stock coverage is an obvious strength of what you have already, the biggest weakness is the lack of uniformity in the data covered for each sector. A simple comparison of the data items displayed for the two sectors in the samples included here shows the problem.
The data included in common across all sectors is extremely limited and comprises not much more than 'Recommendation', 'Target Price', 'Price', 'High', 'Low', 'Market Cap', 'No. of Shares', 'EPS' estimates, 'P/E', 'Ticker' and 'FYE' (Financial Year End).
Just about all the other data chosen to be displayed varies wildly by sector.
In the two examples shown above, for one sector the additional data includes:
1. Operating Income Growth Rate
2. EBIT Margin
3. Working Cap/Sales
4. ROIC
5. Net debt
For the other sector
1. Revenues
2. EV/Revenues
3. Price/Book
4. Book Value
have been selected, i.e. not a single item in common! Nor are these two by any means the only combinations of data items, which vary in just about every sector.
I'm sure these additional data items have been chosen on the basis of what the Head of the Sector Team considers most important for the sector they are following. But unless there is some other spreadsheet somewhere with data items gathered in a more systematic and uniform way, this is not going to give you very much for comparing stocks across sectors or aggregating. In fact, in terms of performance and valuation data, future projected earnings growth and P/E are about your lot!
The next steps to consider are probably:
1. Loading up the data you have into a spreadsheet or similar - preferably directly from a spreadsheet for which you will be able to receive regular updates (see the sketch after this list).
2. Expanding the data with a second/several additional sheets of data, fed by links, for say 'Price' and price performance data - whose absence speaks volumes about the 'unloved' nature of this database, as knowing the valuation stats in the absence of share price movements makes it a pretty dull document - and stuff like earnings relative to consensus, changes in earnings estimates etc.
3. Thinking about collecting a wider and more uniform data set, at least for European stocks, on stuff such as Revenues, EBITDA, Net Debt/EV, DPS and the sort of things that have gone into the sector data items that someone has picked out.
4. Adding more data (such as historic numbers etc.) from an outside data source to fill in further 'back data'.
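To give a flavour of steps 1 and 2, here is a minimal sketch of the sort of thing I mean, assuming Python/pandas and entirely hypothetical file names and column headings ('Ticker', 'EPS FY1' and so on) - substitute whatever the real Jefferies spreadsheet and your price feed actually use:

# Rough sketch of steps 1 and 2: load the master sheet, paste updated
# estimates over the old ones, then bolt on a price/performance feed.
# All file, sheet and column names here are hypothetical.
import pandas as pd

# Step 1: load the master coverage sheet, keyed on the ticker
master = pd.read_excel("jefferies_master.xlsx").set_index("Ticker")

# Overwrite stale earnings estimates with the latest set
new_estimates = pd.read_excel("new_estimates.xlsx").set_index("Ticker")
master.update(new_estimates[["EPS FY1", "EPS FY2"]])

# Step 2: add price and performance data as extra columns
prices = pd.read_excel("price_feed.xlsx").set_index("Ticker")
master = master.join(prices[["Price", "1M % Chg", "12M % Chg"]], how="left")

# Recompute anything driven by price, e.g. the forward P/E
master["P/E FY1"] = master["Price"] / master["EPS FY1"]

master.to_excel("jefferies_master_updated.xlsx")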
Red Box: Strategy Database
Notes on some do's and don'ts, shortcuts and 'gotchas' based on over 20 years of putting together these damn things!
Thursday 2 December 2010
Tuesday 30 November 2010
Putting Together a Simple Historic Valuation Database (Avoiding Quagmires)
OK, so before ranting on too much about all the pitfalls and shortcuts of building databases, let's talk through a simple example.
I'm going to talk through a database that I built one summer for the universe of Spanish stocks back in 1996.
I've chosen this as an example because you mentioned 'Historic Valuation' data as something which might be useful for you to discuss with analysts.
By way of a preamble, the reason for building this was that ABN Amro, who I was working for back in 1996, had acquired Alfred Berg. It then appointed management from Alfred Berg, on the basis of it being a 'best of breed' country broker, to head up the whole European stockbroking operation. The new ex-Alfred Berg management then went on to appoint other ex-Berg people to manage things, and I ended up with an ex-Berg boss from the US sales side who wanted a Berg-style database 'pronto'.
One of the features of Alfred Berg research was that it always featured a P&L, Balance Sheet, Cashflow statement AND VALUATION HISTORY, which went back further than most brokers provided. Whereas the standard for most brokers was 5 years of history, Alfred Berg always put in ten years, which they claimed was particularly appreciated by their clients, especially US ones.
This is a 'Valuation Perspective' from one of the 60-odd companies that I knocked into the Alfred Berg format (it happens to have 9, not 10, years of history because it has 4 forecast years, with 14 columns being about the limit we could squeeze onto a page).
As you can, I hope, see, it basically gives you a simple snapshot of 'range' data, based on High, Low and Year-End share prices. The only investment ratio put against all three price points is the PE.
Other investment ratios are only shown against Year-End prices, but adding full ranges is just a matter of adding the lines and a formula to the spreadsheet. Space considerations were the only thing that stopped them being included. The raw data is there should anyone want to calculate them.
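By way of illustration, here is a minimal sketch of what 'adding the lines and a formula' amounts to. The figures, column names and the choice of Price/Book as the second ratio are mine, purely for illustration, not taken from the original spreadsheet:

# Sketch of 'range' ratio lines: the same formula applied at the High,
# Low and Year-End price points. All figures are invented.
import pandas as pd

row = pd.Series({
    "High": 54.0, "Low": 31.0, "Year End": 47.5,    # share prices
    "EPS": 3.10, "Book Value per Share": 22.0,      # per-share fundamentals
})

for point in ["High", "Low", "Year End"]:
    row[f"P/E at {point}"] = row[point] / row["EPS"]
    # a full range for another ratio is just one more line:
    row[f"P/BV at {point}"] = row[point] / row["Book Value per Share"]

print(row.round(1))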
All of the High/Low/Year-End data was downloaded from Datastream with a relatively simple macro, which pulled the data down across all 60 stocks in the database. THE REAL TRICK WAS TO DOWNLOAD BOTH ADJUSTED AND UNADJUSTED PRICE DATA (i.e. adjusted for Corporate Actions).
In fact nearly all of this table could have been generated using 'unadjusted' price data and figures as reported, at least in terms of the ratios. The only line which really necessitated adjusting everything was the seemingly innocuous '% change' in the year-end share price (i.e. share price performance over the 12-month period).
Adjustments were derived by comparing the Datastream adjusted and unadjusted prices, with a couple of extra lines in the spreadsheet (which are all hidden in the published form) to allow future adjustment factors to be applied within the ongoing database.
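As a rough sketch of that mechanism (the numbers, and the use of pandas, are mine; the real thing was a few hidden rows of spreadsheet formulas fed by Datastream), the hidden lines did something equivalent to this:

# Why the '% change' line needs adjusted prices: invented example of a
# 2-for-1 split in 1994, which halves the quoted (unadjusted) price
# without any real loss to shareholders.
import pandas as pd

years = [1993, 1994, 1995]
unadjusted_ye = pd.Series([100.0, 55.0, 60.0], index=years)  # as quoted at the time
adjusted_ye   = pd.Series([50.0, 55.0, 60.0], index=years)   # restated for the split

# Adjustment factor per year = adjusted / unadjusted (1.0 where nothing happened)
factors = adjusted_ye / unadjusted_ye

# Ratios can be built from unadjusted prices and as-reported per-share data,
# but year-on-year performance has to come from the adjusted series:
bogus_change = unadjusted_ye.pct_change() * 100   # shows a spurious -45% for 1994
true_change  = adjusted_ye.pct_change() * 100     # shows the real +10%

print(factors, bogus_change, true_change, sep="\n")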
The core fundamental data was taken from the following summary P&L, Balance Sheet and Cashflow, which were prepared for each company:
When I say that this was a 'relatively' easy database to set up, I should point out that it took around 8-10 weeks of roughly 90-hour weeks for 60-odd companies, even though I already had a lot of the data; then again, I was working on my own, in an overseas office without Datastream etc.
What happened to all this work?
Well, like a lot of databases, the work in it was not really valued. A new boss arrived, I left, and I believe he deleted it all!
I console myself with the great Jimi Hendrix lyric 'Castles Made of Sand...sink into the sea...eventually'.
Sunday 28 November 2010
Quagmire No 2. - Corporate Actions
My advice on Quagmire No.1 (Consistent Accounting) was not to go there.
It would be nice if that was the case with Database Quagmire No.2 - Corporate Actions....
...But I am afraid this pothole is not one which you can avoid entirely...
... The best I can say is forewarned is forearmed...
....And the best advice that I can give is to avoid as many sinkholes in this particular quagmire as you can, and with this to guide you, hopefully you will.
--------------------------------------ooooooooooooooooooo------------------------------------
The fact is that 'Corporate Actions', 'stock splits', discounted rights issues, convertibles, stock dividends, etc. etc. are actually fiendish traps DESIGNED by corporate financiers EXPLICITLY to confuse and confound investors, and in the process muck up databases.
And muck them up they do. Not least because the corporate action adjustment factors are cumulative: e.g. if a company had a 1-for-10 split every year, we would need to keep adjusting prior years' earnings numbers etc. Add in convertibles, warrants and goodness knows what else and you have a stinking rat's nest to untangle.
--------------------------------------oooooooooooooooooo--------------------------------------
Tip-Toeing through the Tulips
So to counter this fiendishness we first of all need to keep our wits about us and REMEMBER FIRST PRINCIPLES.
For example:
We normally think of PE as being: Share Price / Earnings per Share
So when the Corporate Finance Goblins change the no. of shares by stock split/ discounted rights issue we could get into a real spin.
Unless we remember our 'O' Level (sorry 'GCSE' level in modern parlance) algebra, that is.
Because
EPS=Earnings per share= Total Earnings (however defined)/Total Shares
Similarly
Market Cap=Share Price x Total Shares
Therefore PE also equals
Market Cap/Total Earnings
Because
(Share Price x Total Shares) / Total Earnings = Share Price / (Earnings / Total Shares)
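As a quick numerical sanity check (the figures are invented): a 10-for-1 stock split changes the share price and the EPS, but market cap over total earnings is untouched, so the PE comes out the same either way.

# Invented figures showing that PE = Market Cap / Total Earnings survives
# a 10-for-1 share split, while Price / EPS needs both inputs restated.
total_earnings = 200.0                         # say, millions
shares_before, price_before = 100.0, 30.0      # millions of shares, price per share
shares_after,  price_after  = 1000.0, 3.0      # after the 10-for-1 split

for shares, price in [(shares_before, price_before), (shares_after, price_after)]:
    eps = total_earnings / shares
    market_cap = price * shares
    print(price / eps, market_cap / total_earnings)   # 15.0 and 15.0 both times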
OK, your purist analyst will say, what about dilution for convertibles, do we use average or weighted average number of shares....yada...yada...yada...
IT WILL DEPEND ON WHAT YOU ARE TRYING TO GET OUT OF THE DATABASE WHETHER THIS IS RELEVANT OR NOT.
If, for example, you were trying to build a simple database for historic valuation [which I'll discuss in a separate blog posting] based on high/low/year-end figures for share prices, it might not be necessary to do individual corporate action adjustments at all, or if you do, you might be able to get them from your data source (e.g. Datastream or whoever) by the simple expedient of downloading BOTH adjusted and unadjusted data for your Highs/Lows and YE prices.
You could find that you get what you want by just working off group net profit/earnings (or whatever), the year-end number of shares and the unadjusted year-end share price. So no need for corporate action adjustments.
Or if you are working your data a different way, getting a whole database of corporate action adjustment factors might be as simple as multiplying/dividing your downloaded unadjusted/adjusted price data and VOILA - a whole corporate actions database done for you using just a small Excel workbook of three spreadsheets!!!!
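Something like the following sketch, assuming Python/pandas and a hypothetical workbook with one sheet of adjusted and one of unadjusted year-end prices (stocks as rows, years as columns):

# The 'three sheet' corporate actions workbook: two sheets of downloaded
# prices in, one sheet of adjustment factors out. File and sheet names
# are assumptions, not a real workbook.
import pandas as pd

adjusted   = pd.read_excel("prices.xlsx", sheet_name="Adjusted",   index_col=0)
unadjusted = pd.read_excel("prices.xlsx", sheet_name="Unadjusted", index_col=0)

# Element-by-element division gives the cumulative adjustment factor for
# every stock and every year in one go (1.0 wherever nothing has happened).
factors = unadjusted / adjusted

with pd.ExcelWriter("prices_with_factors.xlsx") as writer:
    adjusted.to_excel(writer, sheet_name="Adjusted")
    unadjusted.to_excel(writer, sheet_name="Unadjusted")
    factors.to_excel(writer, sheet_name="Adjustment Factors")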
(And think smart - there are around 250 trading days in a year, so if you are working off daily data your database will hold data equal to
Total Data = No. of Companies X Years in the Database X 250
But do you need that level of info for what you are doing? If monthly data will do, there are only 12 months in a year, so your database will hold data equal to:
Total Data = No. of Companies X Years in the Database X 12
It can be quite quick to knock up that sort of database).
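A quick back-of-envelope calculation, with purely illustrative company and year counts:

# Illustrative sizing: 60 companies over 15 years
companies, years = 60, 15
print(companies * years * 250)   # 225,000 data points if you go daily
print(companies * years * 12)    # 10,800 if monthly is enough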
-------------------------------------------oooooooooooooooo-------------------------------------------
Obviously 'Corporate Actions' is a much more technical subject than I can deal with in a one-page blog entry, but if an analyst comes into your room in response to a simple request for some data for a simple database, spouting about definitions of weighted average, fully-diluted, cashflow adjusted...nonsense...
1. Put Wax in your ears!
2. Remember these simple lists of do's and don'ts for handling corporate actions
DON'T Be afraid of just chucking your old Database out and starting again. It's often easier.
Just make sure you keep the 'Data', or know where to get it next time.
DO Keep all the 'Unadjusted', 'Original' Data that you can
DON'T Do more than you have to initially. 'Mission Creep' kills these projects.
DO Keep all the spreadsheets etc. with your workings. It can be a lot easier to 'tweak' your data when you already have it and want to get something different out of it.
DO Remember FIRST PRINCIPLES of whatever it is you are trying to do.
An Amazing source for data
The ‘Between the Hedges’ website is a very useful resource for all sorts of data.
The main blog itself consists of summaries of the top news stories from various providers (Bloomberg, Reuters, CNBC, WSJ, FT and eight other sources) three times a day, a daily ‘Bull Radar’ and ‘Bear Radar’ highlighting the main trends/(US) stocks, a news and earnings preview, and a (US) market comment.
The real value of this blog, however, is in the ‘links’ section, which covers 27 categories and several hundred links, and can be found here:
http://hedgefundmgr.blogspot.com/2010/03/links-of-note.html
The categories are:
Global News
U.S. News
Video News
Terrorism/War
Media/Political Watchdogs
Financial News
Financial Portals
Financial Commentary
I-Banks
Economic Portals
Economic Commentary
Central Bank Notes
Market Readings
Trader's Corner
Calendars/Schedules
Sentiment/Indicators
Commodities/Futures
Trading Portals
Sector Work
Trade Journals/Publications
Screens and Scans
Quotes
Stock-Specific Research
Charts of Interest
Hedge Fund Information
Sites of Interest
Blogs of Note
All manner of otherwise hard-to-find information and charts can be found within these links, from Put/Call Ratios and Short Interest, through commodity prices to DRAM prices and price charts, and gems such as Morgan Stanley’s bond market report, the World Gold Council’s Research and Statistics, and RigZone.com, a trade journal for the oil rig hire industry.
In short, a real gold mine of info!
Quagmire No 1: A quagmire you'll never get out of.......Consistent accounting!
Oh what a 'unicorn' this one is! A chase for a rare beast... you never seem to be able to catch!
Analysts with an accounting background everywhere will rubbish whatever database you build...because it isn't ...consistent...the 'c' word...
... So if only you could get a database, based on CONSISTENT accounting, then it would be a great database...but otherwise ...rubbish...
Now here's a piece of advice. Ignore this. Accounts are NEVER consistent. They aren't consistent across regions/countries...they aren't consistent across sectors...AND THEY AREN'T EVEN CONSISTENT FOR THE SAME COMPANY FOR MORE THAN A COUPLE OF YEARS AT A TIME!!!!!
If you have been plugging numbers into spreadsheets/databases on as many different companies and types of companies for as long as I have you will discover an unpleasant truth... accounting is just NEVER consistent. While there is the odd exception, the fact is even the same company changes its accounting ON AVERAGE every THREE years.
Here's what generally happens. Company A thinks a new biz might be interesting. So it starts the investment off as an 'associate' or an 'investment'. So it doesn't consolidate sales or much of anything else...
....if the new biz is a flop, it never makes it into the 'group' P&L, Balance Sheet or Cashflow...
...if it works out, it progresses from 'investment' to 'associate' and, if it grows well enough, is then consolidated into the group, and could end up BECOMING the group...
...Remember NOKIA did not start out as a mobile phone company, and GM's and GE's financing businesses were little investments which ended up being bigger than the core businesses...
So let's just turn this completely around...
...never build a database EXPECTING your data to be consistent
....anticipate and expect changing accounts
... DO NOT throw the baby out with the bathwater...gather the data...gather as much as you can including keeping BOTH the original AND the RESTATED data to make sense of after...
...ORGANISE THE DATA to get what YOU want out of it, as and when it is relevant
... Prepare yourself for an imperfect world, consider consistent data to be a rare gem not a given...
THE ONLY WAY ALL YOUR ACCOUNTING DATA IS GOING TO BE CONSISTENT IS IF YOU EMPLOY A SINGLE ACCOUNTANT WITH ACCESS TO THE MANAGEMENT/INTERNAL ACCOUNTS OF EVERY SINGLE COMPANY IN THE DATABASE - AND THAT AIN'T GONNA HAPPEN!!!
These comments apply mainly to company accounts...but any data series that you want to measure...government stats, oil prices, whatever...isn't that different.
Unicorn catching is for 'virgins'. With five kids between us, you and I are long past that, so forget it!
About the 'Strategy Database' Blog
'Research' does not become 'analysis' until you add numbers!
The better your database the quicker you can analyse, and the better set up your database is the quicker you can turn your analysis into published form - whether as presentation pack or document.
The problem is you can end up spending more time getting the database to work than actually using it!
I first got dragged into the quagmire of databases at SBC where I managed to turn a very nice database which had been bought for a large European department (which then largely left!) into a very useful tool for our team. I sank waist deep into databases at UBS, which had ever spiraling aspirations of what to do with them but never invested anything in technology, and ended up at SG sorting out a very large and expensive one and getting that to work, too!
I've built big databases and I've built small ones. I've used ones which took years to do, and built ones almost as useful in a couple of afternoons... and I've even coped without them altogether!
A good database can increase productivity enormously. A bad one will drag you down.
I hope the suggestions here help you avoid the worst traps, spend as little time as possible struggling in the quagmire of construction, and spend more time using something useful.