Thomas Wood

I am a Consultant Data Scientist with my own consultancy, Fast Data Science.

I help organisations extract value from unstructured data with AI and machine learning. I specialise in text (natural language processing), images, and healthcare/pharmaceuticals.

If an organisation has large volumes of text data, such as incoming emails to triage, insurance reports, or legal and scientific documents, I am the right person to apply machine learning and extract useful information from those materials.

Career

I originally studied Physics to Master's level at the University of Durham, UK. I moved into the machine learning field in 2007, completing a second Master's in Computer Speech, Text and Internet Technology at the University of Cambridge in 2008.

Since then I have spent more than 10 years in the exciting field of machine learning. I have worked at a variety of companies across industries including consulting, pharmaceuticals, computer science, recruitment, retail and security, and I also have some research experience.

Since 2018 I have been working as a freelance data science consultant, helping large organisations around the globe extract value from unstructured data such as text and images.

You can see information about me personally here.

Activation Function (09/27/2020)

An activation function sets the output behavior of each node, or “neuron”, in an artificial neural network.

Related terms: Rectified Linear Units, Vector, Perceptron
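As a minimal sketch, here is one widely used activation function, the Rectified Linear Unit (ReLU), which passes positive inputs through unchanged and clips negative inputs to zero:

```python
# ReLU activation: a simple nonlinearity applied to each neuron's output.
def relu(x: float) -> float:
    # Positive inputs pass through; negative inputs become zero.
    return max(0.0, x)
```

Frameworks such as TensorFlow or PyTorch provide this as a built-in, but the underlying operation is just this one-line comparison.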
Sigmoid Function (09/27/2020)

A sigmoid function is a type of activation function, and more specifically defined as a squashing function, which limits the output to a range between 0 and 1.

Related terms: Odds (Probability), Estimator (Statistics), Neural Network
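The standard logistic sigmoid can be written in a few lines of Python; this is a minimal sketch of the function described above:

```python
import math

def sigmoid(x: float) -> float:
    # Squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))
```

Large positive inputs approach 1, large negative inputs approach 0, and an input of 0 maps to exactly 0.5.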
Random Forests (09/10/2020)

The random forest is a supervised learning algorithm that randomly creates and merges multiple decision trees into one “forest.”

Related terms: Classifier, Machine Learning, Estimator (Statistics)
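The merging step is a majority vote across the trees. As a toy sketch (not a full random forest, which also bootstraps the training data and subsamples features), each hypothetical "tree" below is just a function that votes for a class label:

```python
from collections import Counter

def majority_vote(trees, x):
    # Each tree casts one vote; the forest predicts the most common label.
    votes = [tree(x) for tree in trees]
    return Counter(votes).most_common(1)[0][0]

# Three threshold "stumps" acting as stand-ins for trained decision trees.
forest = [lambda x: x > 0, lambda x: x > 1, lambda x: x > -1]
```

In a real random forest each tree is trained on a random bootstrap sample, which is what makes the combined vote more robust than any single tree.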
Backpropagation (09/02/2020)

Backpropagation, short for backward propagation of errors, is a widely used method for calculating derivatives inside deep feedforward neural networks.

Related terms: Supervised Learning, Classifier, Neural Network
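At its core, backpropagation is the chain rule applied layer by layer. A minimal sketch for a single sigmoid neuron with squared-error loss, assuming a scalar weight and input:

```python
import math

def grad_single_neuron(w: float, x: float, target: float) -> float:
    # Forward pass.
    z = w * x                        # pre-activation
    y = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
    # Backward pass: chain rule, dLoss/dw = dL/dy * dy/dz * dz/dw.
    dL_dy = 2.0 * (y - target)       # derivative of (y - target)^2
    dy_dz = y * (1.0 - y)            # derivative of the sigmoid
    dz_dw = x
    return dL_dy * dy_dz * dz_dw
```

The same pattern, multiplying local derivatives backwards through the network, extends to arbitrarily deep feedforward architectures.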
Unsupervised Learning (08/13/2020)

Unsupervised learning is a machine learning technique that identifies hidden patterns, or clusters, in raw, unlabeled data.

Related terms: Generative Adversarial Network, Supervised Learning, Classifier
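A classic example of finding clusters in unlabeled data is k-means. This toy one-dimensional sketch with k=2 (assuming the data actually contains two well-separated groups) shows the idea:

```python
def kmeans_1d(points, iters=10):
    # Initialise the two centroids at the extremes of the data.
    c1, c2 = min(points), max(points)
    for _ in range(iters):
        # Assign each point to its nearest centroid.
        g1 = [p for p in points if abs(p - c1) <= abs(p - c2)]
        g2 = [p for p in points if abs(p - c1) > abs(p - c2)]
        # Move each centroid to the mean of its assigned points.
        c1 = sum(g1) / len(g1)
        c2 = sum(g2) / len(g2)
    return c1, c2
```

No labels are used at any point; the structure is discovered from the raw values alone.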
Precision and Recall (08/11/2020)

Precision measures how many of the predicted positive cases were actually positive. It can be represented as: Precision = TP / (TP + FP). Recall measures how many of the actual positive cases were predicted correctly. It can be represented as: Recall = TP / (TP + FN).

Related terms: Classifier, Receiver Operating Characteristic Curve, Harmonic Mean
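A minimal sketch of both formulas over paired lists of true and predicted binary labels (1 = positive, 0 = negative):

```python
def precision_recall(y_true, y_pred):
    # Count true positives, false positives and false negatives.
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    precision = tp / (tp + fp)   # of predicted positives, how many were right
    recall = tp / (tp + fn)      # of actual positives, how many were found
    return precision, recall
```

Libraries such as scikit-learn provide these metrics ready-made, but the counts above are all that is happening underneath.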
Generative Adversarial Network (07/22/2020)

A generative adversarial network (GAN) is an unsupervised machine learning architecture that trains two neural networks by forcing them to “outwit” each other.

Related terms: Classifier, Estimator (Statistics), Autoencoder
Transformer Neural Network (07/07/2020)

The transformer is a component used in many neural network designs that takes an input in the form of a sequence of vectors, converts it into a sequence of encodings, and then decodes it back into another sequence.

Related terms: Supervised Learning, Open Source, Tensorflow
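The operation at the heart of the transformer is scaled dot-product attention. This toy sketch (pure Python, no batching, no learned projections) shows one query attending over a sequence of key and value vectors:

```python
import math

def attention(query, keys, values):
    d = len(query)
    # Similarity of the query to each key, scaled by sqrt(dimension).
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    # Softmax over the scores (max subtracted for numerical stability).
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Output is the attention-weighted average of the value vectors.
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]
```

A full transformer applies this with learned query, key and value projections, many heads in parallel, and stacked layers, but the weighted-average mechanism is the same.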
Convolutional Neural Network (05/17/2019)

A convolutional neural network, or CNN, is a deep learning neural network designed for processing structured arrays of data such as images.

Related terms: ImageNet, Classifier, Estimator (Statistics)
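The defining operation of a CNN is the convolution: sliding a small kernel over a 2-D array and summing the elementwise products at each position. A minimal sketch with plain Python lists, no padding or stride:

```python
def conv2d(image, kernel):
    # "Valid" convolution: the kernel stays fully inside the image.
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    return [
        [
            sum(image[i + di][j + dj] * kernel[di][dj]
                for di in range(kh) for dj in range(kw))
            for j in range(out_w)
        ]
        for i in range(out_h)
    ]
```

In a trained CNN the kernel values are learned, and many kernels run in parallel to detect different local features.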
F-Score (05/17/2019)

The F score, also called the F1 score or F measure, is a measure of a test’s accuracy.

Related terms: Classifier, Machine Learning, Harmonic Mean
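Concretely, the F1 score is the harmonic mean of precision and recall, which can be sketched in one line:

```python
def f1_score(precision: float, recall: float) -> float:
    # Harmonic mean: high only when precision AND recall are both high.
    return 2 * precision * recall / (precision + recall)
```

Unlike the arithmetic mean, the harmonic mean punishes imbalance: a classifier with perfect recall but poor precision still gets a low F1.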
Softmax Function (05/17/2019)

The softmax function is a function that turns a vector of K real values into a vector of K real values that sum to 1. The input values can be positive, negative, zero, or greater than one, but the softmax transforms them into values between 0 and 1, so that they can be interpreted as probabilities.

Related terms: Vector, Classifier, Machine Learning
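A minimal Python sketch of the softmax, including the standard trick of subtracting the maximum input for numerical stability (which leaves the result unchanged):

```python
import math

def softmax(xs):
    # Exponentiate each value (shifted by the max), then normalise to sum to 1.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```

Larger inputs always receive larger probabilities, which is why softmax is the usual final layer of a multi-class classifier.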