
Rosetta, the Pioneer of Protein Structure Prediction

Rosetta is a comprehensive computational suite that plays a pivotal role in the protein folding field by predicting and designing protein structures based on amino acid sequences. It employs a combination of physics-based energy functions and advanced algorithms, such as fragment assembly and Monte Carlo sampling, to simulate the folding process and explore the vast conformational landscape of proteins. By iteratively optimizing potential structures, Rosetta helps researchers identify low-energy, stable configurations that closely resemble naturally occurring proteins. This tool not only aids in elucidating fundamental principles of protein structure and function but also supports the design of novel proteins and therapeutic interventions, making it an indispensable resource in structural biology and bioengineering.
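To make the sampling idea concrete, here is a toy sketch of the Metropolis-style accept/reject step that Monte Carlo methods use; the one-dimensional "energy" and random moves below are stand-ins, not Rosetta's actual fragment moves or score function.

```python
import math
import random

def energy(x):
    return (x - 2.0) ** 2                      # placeholder energy landscape

x, kT = 10.0, 1.0                              # starting point and temperature
for _ in range(10_000):
    trial = x + random.uniform(-0.5, 0.5)      # propose a small random move
    dE = energy(trial) - energy(x)
    if dE < 0 or random.random() < math.exp(-dE / kT):
        x = trial                              # accept downhill moves, or uphill moves with Boltzmann probability
print(round(x, 2))                             # typically ends near the low-energy minimum at 2.0
```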

Hemagglutinin, the Influenza Virus Protein

Hemagglutinin is a protein found on the surface of the influenza virus. It is responsible for binding the virus to the host cell, initiating the infection process. Hemagglutinin is a target for the immune system, and antibodies against it can prevent infection.

AlexNet

AlexNet is a convolutional neural network that won the ImageNet Large Scale Visual Recognition Challenge in 2012. It was designed by Alex Krizhevsky, Ilya Sutskever, and Geoffrey Hinton. The network has eight learned layers: five convolutional and three fully connected. It uses ReLU activation functions, dropout for regularization, and data augmentation to improve performance. AlexNet significantly advanced the fields of deep learning and computer vision.
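If you just want to poke at the architecture, a minimal sketch (assuming PyTorch with a recent torchvision, which ships an AlexNet implementation) looks like this:

```python
import torch
from torchvision import models

model = models.alexnet(weights=None)   # 5 convolutional + 3 fully connected layers, randomly initialized
x = torch.randn(1, 3, 224, 224)        # one dummy 224x224 RGB image
logits = model(x)                      # class scores over the 1000 ImageNet categories
print(logits.shape)                    # torch.Size([1, 1000])
```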

esm, Evolutionary Scale Modeling

ESM (Evolutionary Scale Modeling) is a family of large-scale protein language models developed by Meta AI. They’re trained on massive protein sequence databases, learning contextual representations of amino acids purely from sequence data. These representations—often called embeddings—capture both structural and functional clues.
In practice, you feed a protein sequence into ESM to obtain per-residue embeddings, which you can then use for downstream tasks like structure prediction, function annotation, or variant effect prediction. If you batch multiple sequences together, ESM aligns them by adding special start/end tokens and padding shorter sequences to match the longest one. You then slice out the valid embedding region for each protein, ignoring any padding.
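A rough usage sketch, following the pattern in the fair-esm README (the specific checkpoint and representation layer here, esm2_t33_650M_UR50D and layer 33, are assumptions you may want to change):

```python
import torch
import esm

# Load a pretrained ESM-2 model and its accompanying alphabet
model, alphabet = esm.pretrained.esm2_t33_650M_UR50D()
batch_converter = alphabet.get_batch_converter()
model.eval()

data = [("protein1", "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ"),
        ("protein2", "MKTAYIAK")]
labels, strs, tokens = batch_converter(data)   # adds special tokens and pads the shorter sequence

with torch.no_grad():
    out = model(tokens, repr_layers=[33], return_contacts=False)
reps = out["representations"][33]

# Slice out the valid per-residue embeddings, skipping the start token and any padding
per_residue = [reps[i, 1:len(seq) + 1] for i, (_, seq) in enumerate(data)]
print([r.shape for r in per_residue])
```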

PCA

PCA (Principal Component Analysis) is an unsupervised dimensionality reduction technique. It finds the orthogonal directions along which the data varies most and projects the data onto the top few of them, which is useful for visualization, denoising, and compressing high-dimensional features.
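A minimal sketch with scikit-learn (an assumed dependency): project 4-dimensional toy data onto its first two principal components.

```python
import numpy as np
from sklearn.decomposition import PCA

X = np.random.rand(100, 4)             # 100 samples, 4 features (toy data)
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)            # shape: (100, 2)
print(pca.explained_variance_ratio_)   # fraction of variance captured by each component
```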

AI: Logistic Regression

Logistic regression is a supervised machine learning algorithm used for binary classification tasks. Unlike linear regression, which predicts continuous values, logistic regression predicts the probability that a given input belongs to a certain class.
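A small sketch with scikit-learn (an assumed dependency): fit a binary classifier and read out class probabilities rather than raw continuous predictions.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=5, random_state=0)  # toy binary data
clf = LogisticRegression().fit(X, y)

print(clf.predict_proba(X[:3]))   # P(class 0) and P(class 1) for the first three inputs
print(clf.predict(X[:3]))         # hard 0/1 labels via a 0.5 probability threshold
```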

Regularization

Regularization is a way to keep a model from becoming too complicated. It discourages the model from overfitting the training data so that it still makes good predictions on new data. Think of it as adding a 'rule' or 'constraint' that prevents the model from relying too heavily on any single feature or predictor.
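One way to see the effect is to fit the same linear model with and without an L2 penalty; in this sketch (assuming scikit-learn and toy data), the regularized fit shrinks the coefficients so no single feature dominates.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 10))                          # 50 samples, 10 features
y = 3.0 * X[:, 0] + rng.normal(scale=0.5, size=50)     # only the first feature matters

plain = LinearRegression().fit(X, y)                   # no penalty
ridge = Ridge(alpha=10.0).fit(X, y)                    # L2 penalty; alpha sets its strength

print(np.abs(plain.coef_).max(), np.abs(ridge.coef_).max())  # ridge coefficients are pulled toward zero
```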

pyrosetta

PyRosetta is a Python interface to the Rosetta macromolecular modeling suite. It exposes Rosetta's sampling and scoring machinery as Python objects, so you can script custom protein structure prediction, docking, and design protocols without writing C++.
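A minimal sketch, assuming a working PyRosetta install and license; the helper names below follow common PyRosetta tutorials and may sit in slightly different modules depending on your version.

```python
import pyrosetta
from pyrosetta import pose_from_sequence

pyrosetta.init("-mute all")                  # start Rosetta with quiet logging
pose = pose_from_sequence("ACDEFGHIKL")      # ideal-geometry pose built from a toy sequence
scorefxn = pyrosetta.get_fa_scorefxn()       # default full-atom score function
print(pose.total_residue(), scorefxn(pose))  # residue count and total Rosetta energy
```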