Tag Archives: @Legal_Recruit

Where did the ‘situational judgement test’ come from, and where is it going?


There has been a small explosion in research into ‘situational judgement tests’ (SJTs) for employment selection (Weekley and Ployhart, 2006). SJTs present applicants with work-related situations and possible responses to those situations. There are broadly two types of instructions (reviewed by McDaniel et al., 2007): behavioural tendency instructions ask respondents to identify how they would be likely to behave in a given situation, while knowledge instructions ask respondents to evaluate the effectiveness of possible responses to a given situation. Tests assessing an individual’s judgement concerning work-related situations have a long history in the psychological assessment literature (McDaniel et al., 2001). For example, during World War II, Army psychologists attempted to assess the judgement of soldiers (Northrup, 1989). These judgement tests consisted of scenarios with a number of alternative responses. Solutions rested on the person’s ability to draw on common sense, experience, and general knowledge, rather than logical reasoning.


In an influential study, Chan and Schmitt (2002) analysed data from 160 civil service employees with a view to demonstrating the validity of the SJT in predicting overall job performance as well as three performance dimensions: task performance (core technical proficiency: problem analysis, written communication, oral communication), motivational contextual performance (job dedication: motivation to perform, motivation to learn, motivation to work hard), and interpersonal contextual performance (interpersonal facilitation: conflict resolution, negotiation, teamwork and co-operation). Chan and Schmitt (2002) also found that situational judgement tests provided incremental validity over prediction from cognitive ability, personality traits, and job experience.


Previously, Huffcutt and colleagues (2001) had attempted to elucidate the most suitable construct categories. Their constructs included mental capability, knowledge and skills, basic personality traits, applied social skills, interests and preferences, organizational fit, and physical attributes. More recently, Christian, Edwards and Bradley (2010) argued that many studies have nonetheless failed to report the constructs actually measured in SJTs; a construct-level focus in the situational judgement test literature is therefore lacking. Christian and colleagues (2010) found that situational judgement tests most often assess leadership and interpersonal skills, and that situational judgement tests measuring teamwork and leadership skills have relatively high validities for overall performance.


There has been an increasing drive towards standardising these tests. For example, the LR SJT has questions in written format; oral questions, or SJTs in video format, are inevitably different (it is claimed that video SJTs carry more subtle nuances involving social cognition or emotional intelligence which can be picked up on). The questions can also vary in scenario length (longer scenarios tend to have more detail), and in how profession-specific the scenarios are (for example, type of firm, or law/medicine/business). Finally, the questions can vary in format. In the LR SJT, and for example in the 2010 and 2011 Clifford Chance SJTs, you have to pick the single best response out of the options given. In other tests, you may instead be required to rank your choices in order of preference.




Chan, D., Schmitt, N. (2002) Situational judgment and job performance. Human Performance, 15(3), 233-254.


Christian, M.S., Edwards, B.D., Bradley, J.C. (2010) Situational judgment tests: constructs assessed and a meta-analysis of their criterion-related validities. Personnel Psychology, 63, 83-117.


Huffcutt, A.I., Conway, J.M., Roth, P.L., Stone, N.J. (2001) Identification and meta-analytic assessment of psychological constructs measured in employment interviews. Journal of Applied Psychology, 86(5), 897-913.


McDaniel, M.A., Hartman, N.S., Whetzel, D.L., Grubb, W.L. III. (2007) Situational judgment tests, response instructions, and validity: a meta-analysis. Personnel Psychology, 60, 63-91.


McDaniel, M.A., Morgeson, F.P., Finnegan, E.B., Campion, M.A., Braverman, E.P. (2001) Use of situational judgment tests to predict job performance: a clarification of the literature. Journal of Applied Psychology, 86(4), 730-740.


Weekley, J.A., Ployhart, R.E. (2006) Situational judgment tests: theory, measurement, and application. Mahwah, NJ: Erlbaum.


Low end disruptive innovation in psychometric testing for students with disabilities

The new ‘Legal Recruit’ site (twitter thread), to launch on 1 November 2011, offers law students a chance to get better at the kind of verbal reasoning test they might sit as part of a @SHLGROUP assessment when applying for a training contract or vacation scheme placement. The only way to practise these tests properly is to try out the practice assessment which SHL Direct have themselves set.

As a student with a visual impairment myself (I have problems visualising text because of significant diplopia), I wanted to design a platform so that students, including students with visual impairments, can easily practise verbal reasoning tests. I’d like law students who are applying through these SHL tests to get good at them. The online practice platform I’ve designed is an example of disruptive technology; I am currently studying innovation management as part of my special electives on my MBA at BPP Business School (as a full-time student).

Disruptive technologies are not always disruptive to customers, and often take a long time before they are significantly disruptive to established companies. I have no intention of making my platform ‘disruptive’ to SHL. In fact, I want students, including people like me with disabilities, to perform well or even shine at these tests when applying to the corporate firms that use them. This unfortunately chimes with the idea, popular amongst disabled trainees and disabled seniors, that you often have to be ‘better than the competition’ just to get to play on the “equal playing field”; sad but true, if true.

Disruptive technologies are often difficult to recognise. Indeed, as Professor Clayton Christensen (Chair in Business Administration at the Harvard Business School) points out, and as studies have shown, it is often entirely rational for incumbent companies to ignore disruptive innovations, since they compare so poorly with existing technologies or products, and the market initially available for a disruptive innovation is deceptively small compared to the market for the established technology. SHL is a well-established provider of psychometric tests in the “grown-up world” of legal recruitment; my platform is run to help students who wish to succeed to the best of their ability in an incredibly competitive marketplace.

Disruptive technologies like my psychometric testing platform can also be subtly disruptive, rather than prominently so. Previous examples include digital photography (the sharp decline in consumer demand for common 35 mm print film resulting from the popularity of memory cards). In fact, my online assessment platform contains many of the features that make the SHL verbal reasoning tests work (such as a chance to see how much time has elapsed). The subtle disruption I’m introducing is a simple button with which you can enlarge the text: a simple innovation, but one which can make a huge difference to the wellbeing or happiness of students taking such tests. The point about my test is that the student can now concentrate on getting the answers to the verbal reasoning problems correct, not on surviving reading the text with enormous difficulty within the time limit.

For example, here’s the sample question for the learner without reasonable adjustments.

Now, here’s the sample question for the learner with reasonable adjustments for visual impairments, in accordance with the Equality Act 2010.

I hope that students with visual impairments, dyslexia or dyspraxia know that they can “ask for reasonable adjustments”. If legal recruiters fail to make such reasonable adjustments on the production of appropriate evidence, they may indeed be acting unlawfully under equality law here in the UK. I believe my innovation is actually a low-end disruption. Arguably, Spotify is an example of a ‘low-end disruption’ (see for example here). This is a powerful concept in innovation management, described here for Spotify:

At first, a disruptive product fails to deliver a superior offering to the incumbent technology in one or more characteristics of the job-to-be-done. But consumers switch nonetheless because the disruptor has a systemic advantage in at least one of these characteristics. We gave up minicomputer performance for the cost advantage of PCs, we gave up plasma television contrast for the slimness of the LCD, and we gave up the personality of written letters for the speed of emails.


The new @Legal_Recruit verbal reasoning practice assessment for law students

The @Legal_Recruit system (which will be available here) is a very attractive, easy-to-use cloud-based service which will allow @Legal_Recruit learners to complete sample tests under real assessment conditions.

It will be available on Monday 10 October 2011 for the first time.

Current law students doing the GDL, LPC, LLB(Hons) or LLM, especially those seeking the training contracts or vacation placements for 2013/4/5 being made available in the next academic year, may find this new service/product useful. It will be available on the internet via a secure website, and will cost £7.50 for unrestricted lifetime use. All Legal Recruit learners will have their own secure username and password, and will be invited to participate in the development of the huge bank of validated questions. These questions are set in a fair way, with due attention to equality, diversity and culture.

This product has been built because it is felt that many law students, the staff of their colleges and universities (including academics and careers services), and corporate law recruitment managers grossly underestimate the pivotal importance of the verbal reasoning test. This is not sensible, given the intense effort needed to complete any qualification in law. If you perform poorly in a verbal reasoning test and fail to meet the cut-off score, it is possible that you will not be invited for interview, despite having a II.1 or above. This is clearly a tragedy.

Assessments will consist of 30 questions, comprising 15 passages (two questions per passage). The 15 passages will be selected at random by the Legal_Recruit system from a huge database containing an equal number of questions in each of the following 15 subject areas.

  • Biology
  • Business
  • Economics
  • Education
  • Engineering
  • Environment
  • Geography
  • Geology
  • Health and Safety
  • Human resources
  • Medicine
  • Modern Languages
  • Physics
  • Technology
  • Transport
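The assembly rule described above (15 passages drawn at random from a bank balanced across the subject areas, two questions per passage, within a learner-chosen time limit of 19 to 39 minutes) can be sketched as follows. This is a minimal illustrative sketch only, not the actual Legal_Recruit implementation; the bank structure and the function name are assumptions made for the example.

```python
import random

# Illustrative sketch only -- not the actual Legal_Recruit code.
# Assume the question bank maps each subject area to a list of
# passages, where each passage carries exactly two questions.

SUBJECTS = [
    "Biology", "Business", "Economics", "Education", "Engineering",
    "Environment", "Geography", "Geology", "Health and Safety",
    "Human resources", "Medicine", "Modern Languages", "Physics",
    "Technology", "Transport",
]

def build_assessment(bank, n_passages=15, time_limit_mins=39, seed=None):
    """Draw n_passages at random from the pooled bank (2 questions each)."""
    # Learners choose a maximum time between 19 and 39 minutes.
    if not 19 <= time_limit_mins <= 39:
        raise ValueError("time limit must be between 19 and 39 minutes")
    rng = random.Random(seed)
    # Pool every passage across all subject areas, then sample without
    # replacement so no passage appears twice in one assessment.
    pool = [p for subject in SUBJECTS for p in bank[subject]]
    passages = rng.sample(pool, n_passages)
    questions = [q for p in passages for q in p["questions"]]
    return {"passages": passages, "questions": questions,
            "time_limit_mins": time_limit_mins}
```

Because the bank holds an equal number of questions per subject, uniform random sampling over the pooled passages keeps the expected subject mix balanced without hard-coding a per-subject quota.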

@Legal_Recruit follows leading twitter accounts which produce daily news stories; these make excellent narratives for the verbal reasoning assessments that Legal_Recruit will be offering.

Legal_Recruit learners will be able to choose a maximum permitted time of between 19 and 39 minutes; this makes it easy to take the assessments with reasonable adjustments for learners who would benefit from them, allowing them to perform on a ‘level playing field’.

Interestingly, there appears to be no subject bias at all in the exemplars: the passages avoid contentious branding, politics, and generally controversial subjects.
