Tuesday, March 30, 2010

Can Leaflet Distribution Really Keep The Economy Afloat?

Advertising is everywhere, from product placement on television to brand awareness campaigns out on the streets. The marketing industry is a major employer within the UK and encompasses a diverse range of careers. To understand marketing's value to employment levels, it is necessary to understand its effect on trading practices.

Leaflet distribution, for example, is a form of marketing that draws on skills and services from a broad range of professions. The need for it lies in the necessity of advertising an event, a product or a service to a target audience. A company may know the demographic it is aiming for, but it is rarely equipped to get the message to that audience itself, and therefore enlists the services of marketing professionals.

Door-to-door leaflet distribution cannot differentiate between the homes that will be interested in a product and those that will not, so each leaflet must compete for attention once it is through the letterbox. To do this, the design of the leaflet must be appropriate to the product and enticing to the customer. For maximum results, the right imagery, colours, fonts and information must be chosen to make the product or service desirable to the potential customer.

To ensure this is done to the greatest effect, the marketing professionals will enlist the advice of graphic designers, who translate the concept into a visual form that can sell successfully. A sleek design with high-quality graphics has a good chance of competing against other high-end marketing messages, and a few versions of the idea will be sent to the marketing agency for approval.

Once the designs have been chosen, the text is added using tricks of the trade that are likely to get a customer interested. Next, the leaflet has to be produced in hard copy, at which point the services of a printing professional are needed. High volumes of high-quality leaflets require specialist services that use the latest printers on the market. Once the leaflets have been printed, they require distribution.

This has two stages: first, the printed consignment of leaflets must be delivered to a distribution centre, where it is split into smaller units and taken out to the relevant areas. The decisions on where to target the marketing are taken by a distribution specialist based on parameters such as parish boundaries, postcodes and constituencies, among many other options.

Crystal Reports: 5 Tests for Top Performance

It is complete: your masterpiece report. Not only does it meet your customer's expectations, it blows them out of the water; everything they want is beautifully summarised and displayed in a myriad of ways.

Then….

Disaster!

You try to run the report for a month against the live database rather than the two days of test data you used for development.

Suddenly your report's runtime goes from twenty seconds to two hours.

Every Crystal Reports developer has experienced this situation and it can be one of the most frustrating aspects of report design.

Thankfully there are a variety of things that can be done to combat bad performance, any one of which can reap huge benefits.

Here are the five most likely causes of poor performance and how to mitigate their effects.

1. The Database Setup.

This may or may not be within your direct control to alter, but databases are often not set up ideally for reporting.

Two top contenders are:

a. The fields you are filtering on are not indexed. You can check whether or not this is the case by referring to the Linking Tab in the Database Expert window. Indexed fields have colored markers next to them.

I have personally seen reports run hundreds of times faster after an index was added to a field that was important to the report filter.
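As a rough illustration, if a report filters on an order-date column that has no index, the database administrator can add one with a single statement. The table and column names below are invented for the example, and the exact syntax varies slightly between databases:

-- Hypothetical example: the report filters on ORDERS.ORDER_DATE, which has
-- no index, so every run forces a full scan of the table.
CREATE INDEX idx_orders_order_date
    ON orders (order_date);

-- If the report always filters on two fields together, a composite index
-- covering both can help even more.
CREATE INDEX idx_orders_region_date
    ON orders (region_code, order_date);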

b. Using a view rather than a table to report from can be devastating to a report's performance. This is mainly because views have no indexes of their own. A view is a stored query that combines data from one or more tables (much like a basic report) and is often used to simplify data for end users.

The only way to avoid this is to report on the tables which make up the view. Identifying whether the source of a field is a table or a view can be done via the Database Expert as tables and views are listed separately.

Identifying which tables make up a view can be much trickier and you may need the help of the database documentation.
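To make the distinction concrete, here is a sketch of a typical reporting view and the equivalent query written directly against the base tables. All object names are invented, and whether a filter against the view can still benefit from the base tables' indexes depends on the database's optimizer:

-- A typical reporting view: a stored query joining several tables, created to
-- simplify the data for end users. The view itself carries no indexes.
CREATE VIEW v_customer_orders AS
SELECT c.customer_id,
       c.customer_name,
       o.order_id,
       o.order_date,
       o.order_total
FROM   customers c
JOIN   orders    o ON o.customer_id = c.customer_id;

-- Reporting directly on the base tables instead makes it easier for the
-- database to use the indexes on CUSTOMERS and ORDERS when the report
-- filters, for example, on the order date.
SELECT c.customer_name,
       o.order_date,
       o.order_total
FROM   customers c
JOIN   orders    o ON o.customer_id = c.customer_id
WHERE  o.order_date >= DATE '2010-03-01';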

Also, when using Oracle databases, turning off the case sensitive option on queries can really speed up reporting times but may require existing reports to be rewritten.

2. Using the wrong ODBC driver.

ODBC drivers are how Crystal Reports connects to the database. There are usually several ODBC drivers that will work with any particular make of database, and some are better than others.

The only way to really test this is to run the report with all the suitable ODBC drivers and see which is the most efficient.

Experience has taught me that the ODBC driver provided with the database vendor's own client software is usually the best option.

3. Excessive Use of Sub Reports

Each sub report is like another report accessing the database, and if that sub report is placed in the Detail Section it will run for EVERY record the main report loads. Even if placed in a Group Section the sub report will still be run numerous times.

The Report Header and Footer sections are usually the ideal place for a sub report, as they only run once. But this still turns one report into two as far as performance is concerned.

The best way to negate the performance issue caused by sub reports is to not use them.

Ninety-nine percent of sub reports are not necessary; the same result can be achieved through grouping, running totals and/or formulas.
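As a rough illustration of why this matters, a sub report dropped into the detail section just to show each customer's order total behaves like a separate query fired for every row, whereas grouping with a summary does the same work in one pass. The SQL below is only an analogy, with invented table names:

-- Sub-report style: a separate lookup runs for every customer row the main
-- report prints (shown here as a correlated subquery).
SELECT c.customer_name,
       (SELECT SUM(o.order_total)
        FROM   orders o
        WHERE  o.customer_id = c.customer_id) AS total_spent
FROM   customers c;

-- Grouped style: one pass over the data, which is what a group with a summary
-- field or a running total achieves inside the report.
SELECT c.customer_name,
       SUM(o.order_total) AS total_spent
FROM   customers c
LEFT JOIN orders o ON o.customer_id = c.customer_id
GROUP  BY c.customer_id, c.customer_name;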

4. Table Linking

Anything other than a Link Type of equals ('=') will cause a massive degradation in performance.

The Link Options window (accessible by right-clicking on a specific link) allows these settings to be changed.

If there is a need for this type of link, the same result can be achieved through group selection, or through formatting (and hiding the unwanted records) once they are loaded into the report.
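In SQL terms, an equal link is a straightforward indexed lookup, while a non-equal link forces the database to compare each row against a range of candidate rows. The sketch below uses invented tables purely to show the shape of the two joins:

-- Equal ('=') link: each order row is matched to one customer row via the key.
SELECT o.order_id, c.customer_name
FROM   orders o
JOIN   customers c ON c.customer_id = o.customer_id;

-- Non-equal link: each order row has to be compared against a range of rows
-- in PRICE_BANDS, which is far more expensive for the database to evaluate.
SELECT o.order_id, b.band_name
FROM   orders o
JOIN   price_bands b ON o.order_total >= b.band_min
                    AND o.order_total <  b.band_max;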

5. Record Selection

When the record selection formula is written correctly, Crystal Reports will pass all the logic to the database as SQL and only return the data needed.

If the record selection is not written in an SQL friendly way, Crystal Reports will bring back all the data and then filter it locally. This can be drastically slower than when calculated on the database.

Using the Select Expert helps ensure that any filter created is evaluated on the database and is as efficient as possible.
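The difference is easy to see in the SQL that actually reaches the database. A filter that compares the raw column against constants can be pushed down and use an index; one that wraps the column in functions, or relies on logic Crystal cannot translate, forces all the rows back to the report for local filtering. The example below is purely illustrative, with invented names and dialect-dependent functions:

-- SQL-friendly filter: the raw column is compared to constants, so the
-- database evaluates it and can use an index on ORDER_DATE.
SELECT order_id, order_date, order_total
FROM   orders
WHERE  order_date >= DATE '2010-03-01'
  AND  order_date <  DATE '2010-04-01';

-- Not SQL-friendly: wrapping the column in a function usually defeats any
-- index, and the equivalent report-side formula may not be translated to SQL
-- at all, meaning every row is returned and filtered locally.
SELECT order_id, order_date, order_total
FROM   orders
WHERE  TO_CHAR(order_date, 'YYYY-MM') = '2010-03';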

An additional factor that can make a difference in some cases is when the report is scheduled to run: heavy network traffic or heavy database usage can increase a report's running time.

Working through the above points will improve the efficiency of your slow-running reports. Building your reports with all of this in mind from the beginning will save you redevelopment time later.

How to sell your boss on the power of analytics

One would be hard-pressed to recall any major business intelligence and analytics conferences where keynote speakers and breakout presenters were not asked – frequently, repeatedly and quasi-desperately:

* How do I sell BI/analytics projects to my senior managers?
* How do I get executive buy-in?
* How do I calculate the ROI on the projects I am proposing?

Researchers at the IT Leadership Academy were fascinated by the paradox of high-payback, reasonable-risk BI/analytics projects that were put on the back burner in many otherwise thought-to-be-clever enterprises. We undertook a quick analysis in an attempt to understand what is going on here.

Time for a New View of Data Management

Database management is in a crisis, one that's only partly recognized. The horrors of data integration may be well known, but they're only the tip of a much larger iceberg: schema complexity. Programmers, system architects, and database administrators focusing on design and operation alike -- all their jobs are made immeasurably harder by the boggling complexity of relational schemas.
As schema diversity explodes, the pure relational model is collapsing under its own weight. We must replace it with a radically different view of data management, which I'm calling DBMS2, for database management system services. The key aspects of DBMS2 include the following:
• Task-appropriate data managers. Just use whatever is cheapest and simplest for each set of applications. Possible choices include but are not limited to cheap online transaction processing DBMSs, high-end OLTP DBMSs, data warehouse appliances, XML-based document stores, highly distributed and/or small-footprint DBMSs, in-memory systems without their own persistent storage, or cross-corpus indexers without their own storage.
• Drastic limitations on relational schema complexity. Relational schemas shouldn't go far beyond two simple models: master-detail for transactions, and hypercubes/star schemas for analytics (both sketched after this list). Anything inherently more complex is, with rare exceptions, better handled via the schema flexibility of XML. If you need to access data from a legacy application that violates these precepts, do so via XML-based Web services.
• Both XML-based and relational information integration. Eventually, most DBMS2 data integration will be done via XML. But relational enterprise information integration will long have a role to play, such as connecting core OLTP and data warehouse systems.
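As a rough sketch of the two relational patterns the second point allows, the DDL below shows a minimal master-detail pair for transactions and a minimal star schema for analytics. All table and column names are invented for illustration:

-- Master-detail for transactions: one header row, many line rows.
CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL,
    order_date  DATE    NOT NULL
);

CREATE TABLE order_lines (
    order_id    INTEGER NOT NULL REFERENCES orders (order_id),
    line_number INTEGER NOT NULL,
    product_id  INTEGER NOT NULL,
    quantity    INTEGER NOT NULL,
    PRIMARY KEY (order_id, line_number)
);

-- Star schema for analytics: a central fact table surrounded by simple
-- dimension tables, queried by summing the facts and grouping on dimensions.
CREATE TABLE dim_date (
    date_key      INTEGER PRIMARY KEY,
    calendar_date DATE,
    month_name    VARCHAR(20)
);

CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name VARCHAR(100),
    category     VARCHAR(50)
);

CREATE TABLE fact_sales (
    date_key     INTEGER REFERENCES dim_date (date_key),
    product_key  INTEGER REFERENCES dim_product (product_key),
    units_sold   INTEGER,
    sales_amount DECIMAL(12, 2)
);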
DBMS2 is the antithesis of much current database theory. Rather than fighting modularity, DBMS2 embraces it. Rather than gathering administrative tasks in one huge hairball, it spreads them across many simple systems. Above all, unlike the Oracle pipe dream of a grand unified enterprise relational database, DBMS2 is a pragmatic, realistic continuation of what every large enterprise is doing today.
The need and opportunity for DBMS2 are driven by two overlapping trends: platform change and schema explosion. For starters, DBMS2 depends on the increasing availability of XML and Web services technology. It will be years before XML-based data-manipulation languages are sufficiently robust to handle the requirements of DBMS2, but those developments will happen, and most big software vendors will provide strong support for them in a timely manner.
Beyond that, one of the biggest reasons for embracing DBMS2 is a flood of low-cost alternatives to traditional DBMSs. For most enterprises, relational OLTP is approaching commodity status. Microsoft SQL Server is following Oracle up the food chain, while MySQL (which is even slated for SAP certification in two to three years, or maybe less) nips at Microsoft's heels.
Even more important, there's been an explosion in ultracheap OLAP technologies, both in-memory and in appliance formats. Most of these have very simple indexing schemes -- some have no indexes at all -- which yields huge TCO advantages in storage costs and administrative overhead alike.
The opportunity provided by these fledgling technologies might seem balanced by obvious risks. But before long, embracing them will be the only viable choice. The primary reason is schema explosion, on multiple fronts.
First, there's an explosion in profiles. CRM customer profiles (ideally with full Web site click-trail data), vendor profiles, security-oriented user profiles, you name it -- in almost all cases, the available information, and types of information, vary from one profilee to the next. Mobile/pervasive devices just worsen the problem, adding complexity in terms of location, availability and form factor. Centralized, pre-DBMS2 master data management will never succeed.
Second, text documents are becoming an ever bigger part of IT, be they complex forms and contracts, maintenance manuals, health records, Web marketing content or just e-mail. Documents are commonly unpredictable in structure and sometimes in authoring and editing metadata as well. And the ultimate solutions to making text search work will depend on further schema extension and variability, in a number of respects.