Great to hear from you @cd155! Yes, we could teach Drasil about linear regression in the same way that we taught it about ODEs. We could build on that to teach Drasil about machine learning. We could also interface with existing machine learning libraries, rather than generate all the code within Drasil. @JacquesCarette and I have spoken about this in the past. It feels like a very likely direction for future research, but it would be premature to pursue it right now. We have been working with @balacij (current PhD student) to refactor the implementation of Drasil. Building too much knowledge into the current implementation would incur technical debt that we would have trouble paying off in the future. 😄 Hopefully we will get to expanding Drasil's knowledge soon. Future students would likely be interested in machine learning projects. If we can remove the drudgery of creating machine learning code, allowing the focus to shift to domain knowledge, we could have an impact in the ML community too. 😄
@smiths @JacquesCarette While studying machine learning-related topics, I noticed that Drasil may not be far from solving an optimization problem, such as minimizing the mean squared error (MSE), using gradient descent. Drasil already has some knowledge of ODEs, including partial derivatives, which are central to gradient descent. The algorithm runs a loop, and on each iteration a matrix product is used to update the weights. If Drasil cannot handle matrix products, interfacing with a third-party library, such as `numpy.dot`, could help. Solving min(MSE) yields a linear regression model, which is widely used in machine learning.
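For concreteness, here is a minimal sketch of the loop described above, written by hand in NumPy (this is not Drasil-generated code; the function name, learning rate, and iteration count are illustrative assumptions). Each iteration computes the partial derivatives of the MSE with respect to the weights via `numpy.dot` and takes a step downhill:

```python
import numpy as np

def fit_linear_regression(X, y, lr=0.01, n_iters=1000):
    """Fit weights w minimizing MSE = mean((X @ w - y)**2) by gradient descent."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    for _ in range(n_iters):
        residual = np.dot(X, w) - y                        # predictions minus targets
        grad = (2.0 / n_samples) * np.dot(X.T, residual)   # partial derivatives of MSE w.r.t. w
        w -= lr * grad                                     # weight update via matrix products
    return w

# Tiny example: data generated from y = 2*x + 1, with a bias column of ones.
X = np.column_stack([np.ones(5), np.arange(5, dtype=float)])
y = 2.0 * np.arange(5, dtype=float) + 1.0
w = fit_linear_regression(X, y, lr=0.05, n_iters=5000)     # w approaches [1.0, 2.0]
```

The loop is just repeated matrix products and subtractions, which is why generating it (or delegating the products to a library) seems within reach once the partial-derivative knowledge is in place.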