Variable-Ratio Schedule Characteristics and Examples
Variable-ratio schedules provide partial, unpredictable reinforcement

By Kendra Cherry, MSEd. Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."
Fact checked by Sean Blackburn, a fact-checker and researcher with experience in sociology, field research, and data analytics.
Updated on November 08, 2023

In operant conditioning, a variable-ratio schedule is a partial schedule of reinforcement in which a response is reinforced after an unpredictable number of responses. This schedule creates a steady, high rate of response. Gambling and lottery games are good examples of rewards based on a variable-ratio schedule.

Schedules of reinforcement play a central role in the operant conditioning process. The frequency with which a behavior is reinforced helps determine how quickly a response is learned as well as how strong the response might be. Each schedule of reinforcement has its own unique set of characteristics.

[Illustration by Brianna Gilmartin, Verywell]

Characteristics of Variable-Ratio Schedules

There are three common characteristics of a variable-ratio schedule.
They are:

Rewards are provided after an unpredictable number of responses: There is no way to predict when a reward will be received. It might come after the first response, the fifth, or another number entirely.

Leads to a high, steady response rate: Because the subject doesn't know which response will be rewarded, they continue to respond each time in the hope that it will be the one that earns the reward.

Results in only a brief pause after reinforcement: After reinforcement is received on a variable-ratio schedule, there is just a minor pause in responding. This is similar to a variable-interval schedule, in which the post-reinforcement pause is also brief.

How to Identify a Variable-Ratio Schedule

When identifying different schedules of reinforcement, it can be helpful to start with the name of the schedule itself. In "variable-ratio," the word "variable" indicates that reinforcement is delivered after an unpredictable number of responses, while "ratio" indicates that reinforcement depends on the number of responses made (rather than on how much time has passed). Together, the terms mean that reinforcement is delivered after a varying number of responses.

It can also be helpful to contrast the variable-ratio schedule with the fixed-ratio schedule of reinforcement, in which reinforcement is provided after a set number of responses.

For example, on a VR 5 schedule, an animal receives a reward after every five responses, on average. One reward might come after three responses, the next after seven, the next after five, and so on. The schedule averages out to one reward per five responses, but the actual delivery remains unpredictable. On a fixed-ratio schedule, by contrast, the schedule might be set at FR 5, meaning a reward is presented after every five responses.
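The difference between the VR 5 and FR 5 schedules described above can be illustrated with a short simulation. This is a minimal sketch, not from the article: it assumes the variable-ratio requirement is drawn uniformly between 1 and 9 responses (which averages to 5), while the fixed-ratio requirement is always exactly 5.

```python
import random

def fixed_ratio_requirement(ratio):
    """FR schedule: reinforcement arrives after exactly `ratio` responses."""
    return ratio

def variable_ratio_requirement(mean_ratio, rng):
    """VR schedule: reinforcement arrives after a random number of responses
    that averages out to `mean_ratio` (here drawn uniformly from 1..2*mean-1)."""
    return rng.randint(1, 2 * mean_ratio - 1)

rng = random.Random(0)

# Simulate 10,000 reinforcements under each schedule.
fr_counts = [fixed_ratio_requirement(5) for _ in range(10_000)]
vr_counts = [variable_ratio_requirement(5, rng) for _ in range(10_000)]

print(sum(fr_counts) / len(fr_counts))  # always exactly 5.0
print(sum(vr_counts) / len(vr_counts))  # close to 5, but each reward is unpredictable
```

Both schedules deliver a reward roughly once per five responses on average, but under the variable-ratio schedule no single reward can be anticipated, which is what sustains the high, steady response rate described above.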
Where the variable-ratio schedule is unpredictable, the fixed-ratio schedule is set and predictable.

Variable-Ratio Schedule
- Reinforcement provided after a varying number of responses
- Delivery schedule unpredictable
- Examples include slot machines, door-to-door sales, video games

Fixed-Ratio Schedule
- Reinforcement provided after a set number of responses
- Delivery schedule predictable
- Examples include production line work, grade card rewards, sales commissions

Variable-Ratio Schedule Examples

What does variable-ratio reinforcement look like in a real-world setting? Here are a few examples to consider.

Classroom learning: A variable-ratio schedule can be used in the classroom to help students learn. Because students won't know exactly when they will be rewarded for doing their homework, for instance, they may be more inclined to turn in all of the required assignments.

Slot machines: Players have no way of knowing how many times they must play before they win. All they know is that, eventually, a play will win. This is why slot machines are so effective and why players are often reluctant to quit. There is always the possibility that the next coin they put in will be the winning one.

Social media: A variable-ratio schedule appears in social media in two ways. First, when you open your social media accounts, you never know whether you'll find any notifications, comments, or likes, yet you keep going back to check whether anything has shown up. Along similar lines, you never know what is going to appear in your news feed, but you keep scrolling to find posts you like.

Sales bonuses: Call centers often offer random bonuses to employees. Workers never know how many calls they need to make to receive the bonus, but they know that more calls or sales increase their chances.

Door-to-door sales: In this variable-ratio example, the salesperson travels from house to house but never knows when they will find an interested buyer.
It could be the next house, or it might take multiple stops to find a new customer.

Video games: In some games, players collect tokens or other items in order to receive a reward or reach the next level. The player may not know how many tokens they need to receive a reward, or even what that reward will be.

Sources
1. James RJE, O'Malley C, Tunney RJ. Understanding the psychology of mobile gambling: A behavioural synthesis. Br J Psychol. 2017;108(3):608-625. doi:10.1111/bjop.12226
2. Comparative Cognition Library, University of Iowa. Schedule of reinforcement.
3. Killeen PR, Posadas-Sanchez D, Johansen EB, Thrailkill EA. Progressive ratio schedules of reinforcement [published correction appears in J Exp Psychol Anim Behav Process. 2009;35(2):152]. J Exp Psychol Anim Behav Process. 2009;35(1):35-50. doi:10.1037/a0012497