When an embedded system is being developed, what exactly is the role of "delays" or "timers" in the microcontroller? As far as I have learned, when a device is built around a microcontroller, it needs certain delays for certain tasks, for example to skip switch-bounce effects. When a hardware device provides data and software receives that data over serial communication, isn't it mandatory to provide certain delays on the microcontroller while it is passing data to the software? I have tested this with an Arduino: if I do not give any delay and just keep sending data continuously, the software stays busy receiving data and gets to its other work very, very late. I assume that is because the serial communication event stays busy all the time, since no delay is given in the microcontroller program. So I want to know: how correct am I, and what am I missing?
Digital electronics is almost by definition time based. You have some logic, or a processor (which is just some logic), that runs off of some "clock". At every clock tick the synchronous logic is evaluated; between clock ticks the combinational logic settles based on the changes at the last clock tick.
Not always, but often, a microcontroller is used as a programmable replacement for logic. It is itself logic, remember, just pretty easy to program and sometimes cheaper than a CPLD or FPGA. So you may have other time-based things you need to do. You may want to sample an input, maybe apply logic to it, maybe do math on it. But if you only need to sample it 10 times a second, while your software and processor are fast enough to sample 100 times a second, is that better or worse? Maybe you change a line of code and it now samples 73 times a second, but the sampling rate feeds into the math you are doing. What if you used a timer instead: just some logic in the MCU that uses the MCU clock to keep track of time in ticks, and you used that to measure a tenth of a second. It is like setting a timer to check on food that is cooking: you could stare at it constantly, or set a timer and do other things. Same here.
The timer can be used to space things out: how often you check something, how often you tell somebody something. There may be logic, or a bus you are talking to, whose output you cannot change more often than some amount of time. So you use a delay, which ideally uses a timer: change the output, then you cannot change it for another 22 ms, so delay 22 ms, change it again, delay, and so on. Or maybe you need to be more accurate than that, and you write timer-based code or use timer features in the MCU to do it.