Thomas Wood

I am a Consultant Data Scientist with my own consultancy, Fast Data Science.

I help organisations extract value from unstructured data with AI and machine learning. I specialise in text (natural language processing), images, and healthcare/pharmaceuticals.

If an organisation has large volumes of text data, such as incoming emails to triage, insurance reports, or legal and scientific documents, I am the right person to apply machine learning to extract useful information from those materials.

Career

I originally studied Physics to Master's level at the University of Durham, UK. I moved into the machine learning field in 2007, completing a second Master's in Computer Speech, Text and Internet Technology at the University of Cambridge in 2008.

Since then I have spent more than 10 years in the exciting field of machine learning. I have worked at a variety of companies in industries including consulting, pharmaceuticals, computer science, recruitment, retail and security, and I also have some research experience.

Since 2018 I have been working as a freelance data science consultant, helping large organisations around the globe extract value from unstructured data such as text and images.

You can see information about me personally here.

  • Activation Function

An activation function sets the output behavior of each node, or “neuron”, in an artificial neural network.

    Rectified Linear Units Vector Neural Network
    09/27/2020 ∙ 609

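As a minimal sketch of how an activation function shapes a node's output, here is the rectified linear unit (ReLU) mentioned above, applied to a few pre-activation values; the function and variable names are illustrative.

```python
def relu(x):
    """Rectified Linear Unit: passes positive inputs through, zeroes out negatives."""
    return max(0.0, x)

# A node's pre-activation value is the weighted sum of its inputs;
# the activation function decides what the node actually emits.
pre_activations = [-2.0, -0.5, 0.0, 1.5, 3.0]
outputs = [relu(z) for z in pre_activations]
print(outputs)  # negative inputs are clipped to 0
```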

  • Sigmoid Function

    A sigmoid function is a type of activation function, and more specifically defined as a squashing function, which limits the output to a range between 0 and 1.

    Odds (Probability) Estimator (Statistics) Neural Network
    09/27/2020 ∙ 394

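The squashing behaviour described above can be seen in a few lines; this is a plain implementation of the standard logistic sigmoid, with illustrative names.

```python
import math

def sigmoid(x):
    """Squashes any real number into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

# Large negative inputs approach 0, large positive inputs approach 1,
# and an input of 0 maps to exactly 0.5.
print(sigmoid(-10.0), sigmoid(0.0), sigmoid(10.0))
```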

  • Random Forests

The random forest is a supervised learning algorithm that builds many decision trees on random subsets of the data and combines their predictions into one “forest.”

    Machine Learning Estimator (Statistics) Linear Regression
    09/10/2020 ∙ 664

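To make the idea concrete, here is a toy sketch of the two key ingredients, bootstrap sampling and majority voting, using one-dimensional data and depth-one trees (“stumps”) rather than full decision trees; all function names are my own and this is not a production implementation.

```python
import random

def train_stump(xs, ys):
    """Fit a one-split decision 'tree' (a stump) on 1-D data: pick the
    threshold that best separates class 0 (below) from class 1 (above)."""
    best_threshold, best_errors = None, float("inf")
    for t in sorted(set(xs)):
        errors = sum((x >= t) != y for x, y in zip(xs, ys))
        if errors < best_errors:
            best_threshold, best_errors = t, errors
    return best_threshold

def train_forest(xs, ys, n_trees=25, seed=0):
    """Train each stump on a bootstrap sample (drawn with replacement)."""
    rng = random.Random(seed)
    stumps = []
    for _ in range(n_trees):
        sample = [rng.randrange(len(xs)) for _ in xs]
        stumps.append(train_stump([xs[i] for i in sample],
                                  [ys[i] for i in sample]))
    return stumps

def predict(stumps, x):
    """Majority vote across the forest."""
    votes = sum(x >= t for t in stumps)
    return int(votes > len(stumps) / 2)

# Toy data: points below 5 are class 0, points above 5 are class 1.
xs = [0, 1, 2, 3, 6, 7, 8, 9]
ys = [0, 0, 0, 0, 1, 1, 1, 1]
forest = train_forest(xs, ys)
print(predict(forest, 1), predict(forest, 8))
```

Each stump sees a slightly different resample of the data, so the individual trees disagree, but the vote across the forest is more robust than any single tree.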

  • Backpropagation

    Backpropagation, short for backward propagation of errors, is a widely used method for calculating derivatives inside deep feedforward neural networks.

    Supervised Learning Classifier Stochastic Gradient Descent
    09/02/2020 ∙ 394

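The chain-rule bookkeeping that backpropagation performs can be written out by hand for a single sigmoid neuron with a squared-error loss; this is a minimal sketch with illustrative names, not a general-purpose implementation.

```python
import math

def forward(w, b, x):
    """One sigmoid neuron: prediction = sigmoid(w*x + b)."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def loss(pred, target):
    return (pred - target) ** 2

def gradients(w, b, x, target):
    """Backpropagation: chain the local derivatives to get dLoss/dw, dLoss/db."""
    pred = forward(w, b, x)
    dloss_dpred = 2 * (pred - target)   # derivative of the squared error
    dpred_dz = pred * (1 - pred)        # derivative of the sigmoid
    dz_dw, dz_db = x, 1.0               # derivatives of z = w*x + b
    return (dloss_dpred * dpred_dz * dz_dw,
            dloss_dpred * dpred_dz * dz_db)

w, b, x, target = 0.0, 0.0, 1.0, 1.0
initial = loss(forward(w, b, x), target)
for _ in range(100):                    # gradient descent steps
    gw, gb = gradients(w, b, x, target)
    w -= 0.5 * gw
    b -= 0.5 * gb
final = loss(forward(w, b, x), target)
print(initial, final)  # the loss shrinks as the error is propagated back
```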

  • Unsupervised Learning

Unsupervised learning is a machine learning technique that identifies hidden patterns, or clusters, in raw, unlabeled data.

    Generative Adversarial Network Supervised Learning Classifier
    08/13/2020 ∙ 731

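A classic example of finding clusters without labels is k-means; here is a minimal one-dimensional sketch (function and variable names are my own) showing the two alternating steps: assign points to the nearest centroid, then move each centroid to its cluster's mean.

```python
def kmeans_1d(points, centroids, iterations=10):
    """Minimal k-means on 1-D data: repeatedly assign each point to its
    nearest centroid, then move each centroid to the mean of its cluster."""
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            nearest = min(range(len(centroids)),
                          key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids

# No labels are given; the algorithm discovers the two groups on its own.
data = [1.0, 2.0, 3.0, 10.0, 11.0, 12.0]
centers = kmeans_1d(data, centroids=[0.0, 5.0])
print(sorted(centers))  # the centroids settle at 2.0 and 11.0
```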

  • Precision and Recall

Precision measures how many of the predicted positive cases were actually positive. It can be represented as: Precision = TP / (TP + FP). Recall measures how many of the actual positive cases were predicted correctly. It can be represented as: Recall = TP / (TP + FN).

    Classifier Harmonic Mean Probability
    08/11/2020 ∙ 287

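The two formulas can be computed directly from counts of true positives (TP), false positives (FP) and false negatives (FN); this is a small sketch with illustrative names.

```python
def precision_recall(y_true, y_pred):
    """Compute precision = TP/(TP+FP) and recall = TP/(TP+FN)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    return tp / (tp + fp), tp / (tp + fn)

# 4 actual positives; the model predicts 3 positives, 2 of them correct.
y_true = [1, 1, 1, 1, 0, 0]
y_pred = [1, 1, 0, 0, 1, 0]
precision, recall = precision_recall(y_true, y_pred)
print(precision, recall)  # precision 2/3, recall 1/2
```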

  • Generative Adversarial Network

    A generative adversarial network (GAN) is an unsupervised machine learning architecture that trains two neural networks by forcing them to “outwit” each other.

    Classifier Estimator (Statistics) Autoencoder
    07/22/2020 ∙ 1229


  • Transformer Neural Network

The transformer is a component used in many neural network designs that takes an input in the form of a sequence of vectors, converts it into a sequence of encodings, and then decodes it back into another sequence.

    Supervised Learning Tensorflow Neural Network
    07/07/2020 ∙ 161


  • Convolutional Neural Network

    A convolutional neural network, or CNN, is a deep learning neural network designed for processing structured arrays of data such as images.

    ImageNet Classifier Estimator (Statistics)
    05/17/2019 ∙ 1385


  • F-Score

The F score, also called the F1 score or F measure, is a measure of a test’s accuracy, defined as the harmonic mean of precision and recall.

    Machine Learning Harmonic Mean Geometric Mean
    05/17/2019 ∙ 732

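Given precision and recall, F1 is one line of arithmetic; the function name below is my own choice.

```python
def f1_score(precision, recall):
    """F1 is the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# A classifier that finds every positive (recall 1.0) but is right only
# half the time (precision 0.5) gets an F1 between the two, pulled towards
# the weaker value: a property of the harmonic mean.
print(f1_score(0.5, 1.0))  # 2/3
```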

  • Softmax Function

    The softmax function is a function that turns a vector of K real values into a vector of K real values that sum to 1. The input values can be positive, negative, zero, or greater than one, but the softmax transforms them into values between 0 and 1, so that they can be interpreted as probabilities.

    Vector Classifier Confusion Matrix
    05/17/2019 ∙ 200

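The transformation described above can be sketched in a few lines; this version subtracts the maximum input before exponentiating, a standard trick (not mentioned above) that avoids overflow for large inputs. Names are illustrative.

```python
import math

def softmax(values):
    """Exponentiate and normalise so the outputs are positive and sum to 1.
    Subtracting the max first keeps math.exp from overflowing."""
    m = max(values)
    exps = [math.exp(v - m) for v in values]
    total = sum(exps)
    return [e / total for e in exps]

scores = [2.0, 1.0, -1.0, 3.0]   # raw network outputs ("logits")
probs = softmax(scores)
print(probs, sum(probs))         # each value lies in (0, 1); they sum to 1
```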