MATLAB: How to model an equation


Hi there, I have been trying to work out how to model the problem below in MATLAB, but I couldn't. Can anybody help here, please?
Thank you very much for your help.
A ball starts falling from rest through a dense liquid. Its velocity v (in cm/s) is given by the equation: dv/dt = 1.6 − 0.025v^2.
If v(0) = 0 cm/s, show that it will take approximately 6.77 s for the ball to reach a velocity of 7.0 cm/s.
If the container is 1 meter (100 cm) deep, determine the time at which the ball reaches the bottom.
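
For reference, the stated 6.77 s can be verified by hand (a quick check, not part of the original post): the ODE separates as dv/dt = 0.025*(64 − v^2), so with v(0) = 0 the solution is v(t) = 8*tanh(t/5), with terminal velocity sqrt(1.6/0.025) = 8 cm/s. Setting v = 7 gives:

    % Closed-form check of the 6.77 s figure, assuming
    % dv/dt = 1.6 - 0.025*v^2 with v(0) = 0, i.e. v(t) = 8*tanh(t/5)
    t_7 = 5*atanh(7/8)   % about 6.77 s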

Best Answer

  • I think this should work. Given the time t1 at which you want the velocity and the depth of the container (in cm) as inputs, the function returns the velocity at time t1 and the time taken to hit the bottom.
    function [velocity, time] = diff_eqn(t1, depth)
    %Inputs are the time t1 (s) and the depth of the container (cm);
    %the function returns the velocity (cm/s) at time t1 and the
    %time (s) taken to hit the bottom of the container.
    syms y(t)
    ode  = diff(y,t) == 1.6 - 0.025*y^2;
    cond = y(0) == 0;
    ySol(t) = dsolve(ode, cond);          % closed form: 8*tanh(t/5)
    velocity = double(ySol(t1));
    %The velocity is not constant, so the fall time is NOT depth/velocity.
    %The distance fallen is the integral of the velocity; solve it for T.
    syms tau T
    pos(T) = int(ySol(tau), tau, 0, T);   % distance fallen (cm) after T s
    time = double(vpasolve(pos(T) == depth, T, depth/8));
    end
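
As a cross-check on the time to the bottom (the velocity is not constant, so depth/velocity underestimates it), here is a purely numeric version with ode45 that needs no Symbolic Math Toolbox. The 100 cm depth and the [0 20] s time span are my assumptions from the question:

    % Numeric cross-check. State vector y = [position; velocity].
    f = @(t, y) [y(2); 1.6 - 0.025*y(2)^2];
    [t, y] = ode45(f, [0 20], [0; 0]);
    v_677 = interp1(t, y(:,2), 6.77)   % about 7.0 cm/s
    T_bot = interp1(y(:,1), t, 100)    % about 16 s to reach the bottom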