Tuesday 30 December 2008

Benchmarking - what's it all about?


Benchmarking is the process of comparing the cost, time or quality of what one organization does against what another organization does. The result is often a business case for making changes in order to make improvements. Also referred to as "best practice benchmarking" or "process benchmarking", it is a process used in management where organizations evaluate various aspects of their processes in relation to best practice, usually within their own sector. This then allows organizations to develop plans on how to make improvements or adopt best practice, with the aim of increasing performance.

The most prominent methodology is the 12-stage approach developed by Robert Camp (who literally wrote the book on benchmarking in 1989). It consists of:
1. Select subject
2. Define the process
3. Identify potential partners
4. Identify data sources
5. Collect data and select partners
6. Determine the gap
7. Establish process differences
8. Target future performance
9. Communicate
10. Adjust goal
11. Implement
12. Review / recalibrate

Benchmarking can take various guises:
Process benchmarking - a firm focuses its observation and investigation of business processes with a goal of identifying and observing the best practices from one or more benchmark firms. Activity analysis will be required where the objective is to benchmark cost and efficiency.

Financial benchmarking - a company performs a financial analysis and compares the results in an effort to assess overall competitiveness.

Performance benchmarking - allows a firm to assess its competitive position by comparing products and services with those of target firms.

Product benchmarking - the process of designing new products or upgrades to current ones. This process can sometimes involve reverse engineering competitors’ products to find strengths and weaknesses.

Strategic benchmarking - involves observing how others compete. This type is usually not industry specific, meaning it is best to look at other industries.

Functional benchmarking - a company will focus its benchmarking on a single function in order to improve the operation of that particular function, e.g. Human Resources, Finance or ICT.

Internal benchmarking - involves benchmarking businesses or operations from within the same organisation (e.g. business units in different countries).

External benchmarking - analysing outside organisations that are known to be best in class provides opportunities of learning from those who are at the ‘leading edge’.

International benchmarking - best practitioners are identified and analysed elsewhere in the world; globalisation and advances in information technology are increasing opportunities for international projects.

Benchmarking involves four key steps:
1) Understand in detail existing business processes
2) Analyse the business processes of others
3) Compare own business performance with that of others
4) Implement the steps necessary to close the performance gap
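To make steps 3 and 4 concrete, here is a minimal sketch of a gap calculation in Python. The metric names and figures are entirely hypothetical, invented for illustration; real benchmarking data would come from the partner-selection and data-collection stages described above.

```python
# Illustrative only: metric names and values are hypothetical.
# Covers step 3 (compare own performance with others) and feeds step 4
# (identify the gap to close).

own = {"cost_per_unit": 12.4, "cycle_time_days": 9.0, "defect_rate": 0.031}
best_practice = {"cost_per_unit": 10.1, "cycle_time_days": 6.5, "defect_rate": 0.018}

def performance_gaps(own, benchmark):
    """Return each metric's gap as a percentage of the benchmark value."""
    return {
        metric: round(100 * (own[metric] - benchmark[metric]) / benchmark[metric], 1)
        for metric in benchmark
    }

for metric, gap in performance_gaps(own, best_practice).items():
    print(f"{metric}: {gap:+.1f}% vs best practice")
```

A positive percentage flags a metric where performance lags best practice and a closure plan is needed.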

Benchmarking should not be considered a one-off exercise. To be effective, it must become an integral part of an ongoing improvement process, with the goal of keeping abreast of ever-improving best practice.

Why Bother?
There are many benefits of benchmarking; the following list summarises the main ones:
· provides realistic and achievable targets
· prevents companies from being industry led
· challenges operational complacency
· encourages continuous improvement
· allows employees to visualise improvement which can be a strong motivator for change
· creates a sense of urgency for improvement
· confirms the belief that there is a need for change
· helps to identify weak areas and indicates what needs to be done to improve.

So how does all this apply to us - what are the benefits of benchmarking CAD performance?
· Gain visibility of core CAD skills
· Identify individuals’ strengths & weaknesses
· Implement better CAD training plans for staff
· Improve CAD recruitment processes
· Share performance data across an organisation
· Promote collaborative working between teams & offices
· Measure the performance of outsourcing or off-shoring partners
· Provide ‘best practice’ for CAD development
· Offer clearer staff inductions and more meaningful staff appraisals
· Enjoy better skills resourcing for projects
· Develop a continuous improvement process for CAD
· Save time and money on construction projects and offer best value for clients

In the end, it all comes down to better visibility of your teams and their real ability to use often complex technology and tools for maximum effect. If you can't measure it, you can't manage it.
Happy New Year!
Rory

Thursday 11 December 2008

CAD skills testing job applicants in the USA - is it allowed?

Is it OK to test the basic CAD skills of prospective employees in the USA? This question seems to come up quite frequently from our friends in the US: is it legally permissible to screen the CAD ability of prospective employees and contract staff before they join your team?

The short answer is yes, in our opinion, but only under the right circumstances. Let’s examine some of the evidence.

The most important issue underlying the use of pre-employment assessments is validity. The question we need to ask is this: 'Is the test valid for its intended purpose; does it support the decisions that are going to be made?'

So what is 'validity'? Validity measures how appropriate a test is for a specific purpose. Simply put, a test may be considered valid for one use and invalid for another. Why do pre-employment tests need to be validated? In 1978, the Equal Employment Opportunity Commission (EEOC) created guidelines to ensure that the knowledge gained from testing is applied with impartiality, to protect minority applicants from discriminatory employment procedures. What is the best method of validation? The EEOC guidelines do not state that one method is better than another; the method used must fit the needs of the business or organization.

There are three methods of validation set forth by the EEOC:

1) Criterion Validity: If data demonstrates that a test is significantly correlated with a vital measure of job performance, the test is said to demonstrate criterion validity. For example, if all the CAD users that scored highly on a selected test to measure CAD skills completed their projects accurately and on time, the test would demonstrate criterion validity.

2) Construct Validity: The term construct is a technical term for personality traits like intelligence and creativity. Construct validity is demonstrated if a test measures traits that have been found to influence successful performance of a job. A test that measures the interpersonal communication skills of a potential customer service representative would demonstrate construct validity.

3) Content Validity: This is demonstrated if the questions that make up an assessment are representative of content that is required to perform a particular activity or task. A test made up of algebra questions given to an applicant for a maths teacher's position would demonstrate content validity.

Many employers use employment tests and other selection procedures in making employment decisions. Examples of these tools, many of which can be administered online, include the following:

- Cognitive tests assess reasoning, memory, perceptual speed and accuracy, and skills in arithmetic and reading comprehension, as well as knowledge of a particular function or job;
- Physical ability tests measure the physical ability to perform a particular task or the strength of specific muscle groups, as well as strength and stamina in general;
- Sample job tasks (e.g. performance tests, simulations, work samples, and realistic job previews) assess performance and aptitude on particular tasks. NB CAD skills assessments fall into this category;
- Medical inquiries and physical examinations, including psychological tests, assess physical or mental health;
- Personality tests and integrity tests assess the degree to which a person has certain traits or dispositions (e.g. dependability, cooperativeness, safety) or aim to predict the likelihood that a person will engage in certain conduct (e.g. theft, absenteeism);
- Criminal background checks provide information on arrest and conviction history;
- Credit checks provide information on credit and financial history;
- Performance appraisals reflect a supervisor’s assessment of an individual’s performance; and
- English proficiency tests determine English fluency.

An important point to remember when interpreting CAD assessment scores is to put the results in context and compare them to an external performance benchmark. For example, a 58% score does not reflect 'failure'. If this score is achieved against an office average of 60%, it is likely the candidate will comfortably fit into the team.

Test questions should also have a recognizable skill level; basic, intermediate or advanced. In this way, the benchmark data can be reliably used to compare CAD performance and make sound hiring decisions.
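As a rough sketch of this benchmarked interpretation, using the hypothetical 58%-versus-60% figures from the example above (the five-point tolerance is an invented threshold, not a CADsmart rule):

```python
# Hypothetical scores used purely to illustrate interpreting a result
# against an office benchmark rather than in isolation.
office_scores = [55, 62, 58, 67, 60, 58]   # existing team, same test
candidate_score = 58

office_average = sum(office_scores) / len(office_scores)  # 60.0 here

def interpret(score, benchmark, tolerance=5):
    """A raw score only means something relative to the benchmark."""
    if score - benchmark >= -tolerance:
        return f"{score}% is within {tolerance} points of the office average ({benchmark:.0f}%)"
    return f"{score}% is more than {tolerance} points below the office average ({benchmark:.0f}%)"

print(interpret(candidate_score, office_average))
```

On these numbers, the 58% candidate sits comfortably within the team's range rather than 'failing'.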

In conclusion, for a CAD skills test to be valid, it must contain content that reflects a representative sample of the target skill. A reliable CAD skills test should comprise sample job tasks (e.g. performance tests, simulations, work samples, and realistic job previews) and assess performance and aptitude on particular tasks.

Skills evaluations should not discriminate on the basis of race, color, national origin, sex, religion, age (40 or older), or disability.

Lastly, employers should ensure that employment tests and other selection procedures are properly validated for the positions and purposes for which they are used. The test or selection procedure must be job-related and its results appropriate for the employer's purpose.

More information can be found here:
http://www.uniformguidelines.com/uniformguidelines.html and
http://www.eeoc.gov/policy/docs/factemployment_procedures.html

Rory

Tuesday 9 December 2008

The View from the Strip...


Well, we're back and just about caught up from our trip to AU2008 in Las Vegas, and I must say, overall we had a great time! This was my first time at AU as both sponsor and speaker - and it certainly made for a busy event!

The logistics of the conference are a sight to behold; even in an economic downturn, there must have been more than 8,000 delegates in attendance. Simply feeding and watering such a horde of enthusiastic CAD aficionados was an impressive logistical feat!

The venue itself has to be seen to be believed; we stayed at the Venetian Resort, which is enormous! The walk from our suite ('room' doesn't do it justice!) to the exhibition hall took about 15 minutes! And yes, I did forget my laptop and had to make the trek back at least once! The attention to the interior detail was fascinating, with faux tapestries, ceiling montages, and even a frighteningly realistic blue sky above the shopping mall. It took me a moment to realise that I was still inside the building; and when we sat at the bar for a quick afternoon drink, my wife asked the waiter if we could sit 'outside'! :) I could tell from his wry smile we weren't the first! We then sat over a beer, to be entertained by a mini-opera just yards away!

There were a couple of moments which might have detracted from an otherwise thoroughly enjoyable week. When setting up the booth, a rather over-zealous security guard insisted we have our name badges re-printed because they were missing a label (which meant queueing again for 30 mins). And the Autodesk 'help' committee tried to charge me $25 for a replacement badge when I left it in my wife's handbag (she was shopping, naturally!) - even though I tried to explain we were spending $5k on a booth! Ah well, I guess they were just following the rules.

From a business perspective, I'd say the show was a resounding success. This year, CADsmart was involved in a new format for the AUGI Top Daug competition, which went remarkably smoothly. Over 500 assessments were taken over two days, on 32 machines, with just two software crashes - a credit to Ed and Martin on our technical team for building such a stable app. The results were posted live to a 'Top 10' scoreboard throughout the event, and the eventual winner was determined by a drafting 'shootout'. Great stuff!

We also launched our new Revit skills assessment, which was tremendously well received. Over 100 companies signed up to trial the software, so we're looking forward to receiving their feedback in the coming days and weeks. We'll be creating an Imperial data set in the next few weeks, for those firms who refuse to work in the real world :) (just kidding!) and will be adding Structure and MEP content in the new year.

Tony & I delivered our class on making the move from AutoCAD to Revit to a full room, and received some good scores on our feedback survey, which is nice to hear.

And so to the gambling; we did have a dabble on the last day, just to be polite! Tony, Mel and I started with a 'pot' of $25 each and here's how we did; Tony - lost it all on a spin of the wheel. Myself - ditto (although I went red 18 and it came up red 19 - so near and yet so far - probably 6 words which sum up the history of Vegas!). Mel - hit the craps table, feigning an innocent ignorance, and walked away with $150, to the amusement of the croupier, who knew a hustler when he saw one! :)

And so, we left Vegas as winners, in more than one sense. Next year's event will be at the Mandalay Bay. Think I'll try my hand at Black Jack...

Rory

Wednesday 3 December 2008

Viva Las Bristol

While my esteemed colleagues Rory and Tony attend Autodesk University in Vegas I'm staying at the ranch making sure everything carries on working tickety-boo. And there's lots going on - our Revit release is now out of Beta (thanks to all the testers!), we launched our new product-focused website (see below) yesterday, and we're also involved in the Top Daug competition at the aforementioned AU, which saw 235 assessments in its first 3 hours of availability last night, and seemingly zero support issues.



Martin and I will be on tenterhooks tonight though, as apparently the Top Daug machines are still going to be available during the 'Beer Bust', which basically means (for any non-Americans reading this) drinking lots and larking about. Oh, and networking of course.

The thought of hundreds of inebriated people attempting to break our bespoke pared-down version of CADsmart sends a shiver down my spine, as it's a million miles away from CADsmart's intended context - a calm, formal, professional, sober atmosphere in which to be assessed. Either way, I'll probably be staying up till 4am waiting for an email or two!

If British universities are anything to go by, then Rory and Tony will come back from AU with huge debts having learnt nothing; however, I suspect universities are held in higher regard across the pond. I'm sure Autodesk will put on a great gig and R & T will have a great time, meet some great people and learn lots of new stuff.

OK I'm jealous really. But 22 hours of travelling?

Ed