
# GQE_For_Drug_Prediction

Nandhakishore C S (DA24M011), Vandit Shah (NS25Z102)

## Goal

Implement the GQE (Generative Quantum Eigensolver) architecture using a Transformer model trained from the DTITR (Drug-Target Interaction Transformer) paper, and evaluate its performance for drug prediction.

## Information on the project

### Our Approach

#### DTITR Model Modification

#### GQE

After deciding to replace the DTITR model with our custom model, we used this trained model in place of the GPT-2 model. Rather than changing the cost function, we introduced a new Hamiltonian for the drug-target interaction task: $H = \sum_{i=0}^{n-1} A_i Z_i + \sum_{i=0}^{n-2} B_i Z_i Z_{i+1} + \sum_{i=0}^{n-2} C_i X_i X_{i+1}$, where the coefficients $A_i, B_i, C_i$ are trainable weights. For the operator pool, there being no alternative way to retrieve the original code, we used `get_operator_pool` with `num_qubits = 8` and `num_electrons = 4`, and kept the Hartree-Fock state initialisation unchanged. During training, the transformer parameters are updated with the AdamW optimiser, while the Hamiltonian weights are updated with the COBYLA optimiser.
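As an illustration, the Hamiltonian above can be assembled explicitly as a dense matrix with NumPy. This is a minimal sketch for a small qubit count; the helper names `kron_chain`, `single`, `pair`, and `build_hamiltonian` are illustrative, not from the project code, and the actual implementation would use a quantum SDK's operator types rather than dense matrices:

```python
import numpy as np

# Single-qubit Pauli matrices
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def kron_chain(ops):
    """Kronecker product of a list of single-qubit operators."""
    out = np.array([[1.0 + 0j]])
    for op in ops:
        out = np.kron(out, op)
    return out

def single(op, i, n):
    """Operator `op` acting on qubit i of an n-qubit register."""
    return kron_chain([op if k == i else I2 for k in range(n)])

def pair(op, i, n):
    """`op` acting on both qubits i and i+1 of an n-qubit register."""
    return kron_chain([op if k in (i, i + 1) else I2 for k in range(n)])

def build_hamiltonian(A, B, C):
    """H = sum_i A_i Z_i + sum_i B_i Z_i Z_{i+1} + sum_i C_i X_i X_{i+1}."""
    n = len(A)
    H = sum(A[i] * single(Z, i, n) for i in range(n))
    H = H + sum(B[i] * pair(Z, i, n) for i in range(n - 1))
    H = H + sum(C[i] * pair(X, i, n) for i in range(n - 1))
    return H

# Example with n = 4 qubits (the project uses num_qubits = 8)
rng = np.random.default_rng(0)
n = 4
H = build_hamiltonian(rng.normal(size=n), rng.normal(size=n - 1), rng.normal(size=n - 1))
print(H.shape)                      # (16, 16)
print(np.allclose(H, H.conj().T))  # True: Hermitian for real weights
```

Since the weights are real and every Pauli term is Hermitian, the resulting H is Hermitian, so its ground-state energy is well defined.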
One key observation: the original GQE was trained on a stationary target, i.e. only the $CO_2$ molecule. Here we instead attempt to have the transformer learn and adapt to different types of reactions, rather than learning to create a circuit for a single fixed target, and to predict the corresponding pKd values.

The learning rate and other hyperparameters have not been swept yet.
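The alternating update scheme described above (gradient-based steps for the transformer parameters, gradient-free COBYLA steps for the Hamiltonian weights) can be sketched on a toy objective. Everything here is a stand-in: the loss is a hypothetical quadratic, not the project's objective, and a plain gradient step replaces AdamW for brevity:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# `theta` plays the role of the transformer parameters (updated with a
# gradient step here; the project uses AdamW), and `w` plays the role of
# the Hamiltonian weights A_i, B_i, C_i (updated with COBYLA).
theta = rng.normal(size=5)
w = rng.normal(size=3)

def loss(theta, w):
    # Toy surrogate for the drug-target interaction loss (e.g. RMSE on pKd);
    # minimised at theta = 1, w = 0.5.
    return np.sum((theta - 1.0) ** 2) + np.sum((w - 0.5) ** 2)

def grad_theta(theta, w):
    # Analytic gradient of the toy loss w.r.t. theta.
    return 2.0 * (theta - 1.0)

lr = 0.1
for epoch in range(50):
    # 1) Gradient-based update of the transformer parameters.
    theta = theta - lr * grad_theta(theta, w)
    # 2) Gradient-free COBYLA update of the Hamiltonian weights,
    #    warm-started from the previous epoch's solution.
    res = minimize(lambda w_: loss(theta, w_), w, method="COBYLA",
                   options={"maxiter": 50})
    w = res.x

print(np.round(theta, 2))  # converges toward 1.0 in each coordinate
print(np.round(w, 2))      # converges toward 0.5 in each coordinate
```

Interleaving the two optimisers this way keeps each update simple: the differentiable transformer gets first-order steps, while the small set of Hamiltonian weights is handled by a derivative-free method.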

## Results

GQE Result

## Conclusion

- The RMSE is increasing, which is a concern, but over 50 epochs it has stayed roughly in the range 0.3 - 0.5, while the best RMSE observed for the classical counterpart was approx. 0.29.
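
For reference, the RMSE figure quoted above is computed as the root of the mean squared error between predicted and true pKd values; a minimal sketch (the project's actual evaluation code may differ):

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root-mean-square error between true and predicted pKd values."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

# Hypothetical pKd values, for illustration only
print(rmse([7.0, 6.5, 8.2], [6.8, 6.9, 8.0]))  # ≈ 0.283
```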

## Future Work

- Try using RLHF.
- Sweep hyperparameters to reduce the RMSE.

## References

  1. DTITR
  2. GQE
