
TCLA Vacation Scheme Applications Discussion Thread 2025-26

Yes, I understand the concepts of construct validity, causation and so on (and it did discuss these types of study rather than the specific studies; I felt the studies I presented were slightly better examples than the ones referenced in the paper, although I can't remember the paper's examples, so I may be wrong here).

But the fact is it hasn't produced empirical evidence that formal logic training can impact WG scores. Combined with other factors, I agree that perhaps it can, e.g. a lack of probabilistic judgement when taking the test, or not understanding what the question types are looking for. From a personal perspective, I disagree that formal logic training lowers scores, but there is no study showing this either way.

Knowing which parts of the test call for deductive standards and which call for inductive standards could penalise logical reasoning, but that also comes down to understanding the non-technical norms of the test (which can be accomplished through a specific understanding of the test, as well as practice tests).

But regardless, it's like economics: all models are wrong, some are useful. There will be flaws in any standardised testing, some more than others. A levels, GCSEs and university exams are much better to go on (though by no means perfect), but when thousands of candidates meet the minimum requirements, the WG is a very easy test to administer, and it correlates with performance (regardless of whether it can accurately assess critical thinking). Formal logic training without understanding and practising the test, and without knowing where to use inductive and deductive reasoning, can theoretically be a disadvantage (due to the ambiguous nature of the questions), I accept that. But once I knew and understood the "test logic", I found it a big help. Again, I'll agree to disagree here; I have absolutely nothing to win or lose in this argument, just saying what helped me with the test, and why I think it's a better (and more reliable) test than SJTs.
I don’t think you’re understanding what you’re arguing here.

The Watson Glaser shows weak subscales / poor internal consistency.

Studies have also shown the Watson Glaser has far lower predictive power for on-the-job performance once academic background etc. is taken into account.

Once you actually dig into it empirically, the Watson Glaser is not what you’re saying it is. I also have not argued for SJTs at any point; they're another screening tool that lazy firms use to screen out candidates.
 
Are Freshfields yet to send out VS ACs? I’ve only seen DTC AC invites on here. Getting a bit nervous as I’m yet to hear…
I doubt it. From what I know, most ACs for VS were held before Christmas and offers were given out; additional ACs were then held in January for VS. Since they have started giving out DTC invites recently, I’m assuming they might be done with VS interviews? All assumptions anyway, based on the timeline.
 
hi guys! just a question to anyone who has experienced the same emotional issue. I have a clear list of firms that have excited me to bits, and I have put so much effort into their applications, but I haven’t had any success with them.

I have been applying more widely, of course, and I have finally secured my first AC (it’s my 2nd application cycle). However, I'll be honest with myself: I am not really excited about this firm. Well, I am, and I do have a strong interest in some of their practice areas. Plus, I know it is a great learning opportunity and I might really enjoy it there, but something still keeps me from being enthusiastic about preparing for the AC.

How do you make yourself super energised about the process? Has anyone had the same feeling?
 
I'm just wondering, since I'm in 2nd year: if I can't secure a TC this cycle and end up applying again in my final year, won't that significantly reduce the pool of firms I can apply to? From what I've seen, loads require that you're in your 2nd year of a law degree. Does that mean I'll have to wait until after I graduate to apply again?
 
The requirement is at least in your second year of a law degree or final year of a non-law degree. I don't know of any firm that doesn't allow final year law students to apply...
 
I am now applying in my final year and I must say I've had almost no issues! I think the majority of firms are absolutely fine with it.
 
I'd just like to quickly chime in about the WG/SJT discussion.
Standardised testing is fine and objective, sure. However, the WG and SJT are not exactly standardised.

Wikipedia defines a standardised test as one that is administered and scored in a consistent or standard manner. It follows that these tests should be easy to practise, consistent with one another, and (most importantly) have correct answers.

WG: it is administered by loads of different providers with different standards as to what counts as "correct." This makes it almost impossible to practise and do consistently well in, because the correct answers fluctuate massively depending on the test provider. In addition, certain sections are weighted differently depending on the administrator or the firm.

SJT: don't even get me started. These (Cappfinity's in particular) have absolutely no correct answer. The options each measure a different trait, and lord knows how the slider ones are scored. People routinely see the same quality reported as both their strongest and weakest after retaking the same test, which is a clear indication that these tests are bullsh*t. Absolute laziness from firms, tbh.

And that is before taking into account how these tests are used in recruitment. If they were being used consistently and in a standard manner, the candidates who score highest would automatically progress, right? But that is not the case. In addition, I really dislike how they are used to "predict behaviour" that is seemingly inconsistent with reality.

TL;DR: I am frustrated at the opaque and frankly somewhat lazy recruitment process.
 
Agree completely with this.
 
I just completed the WBD Neurosight test. My first test of this type.

That's an interesting one. You place your mouse in a set position, a question appears, and then after 5 seconds a few possible choices appear on screen; you move the mouse to the one that reflects your immediate thoughts. I've heard they track mouse movements to gauge how authentic your responses are.

Makes a nice change from WG, of course!
 
