[Note: Many of these reviews are years, and sometimes decades, old. So, check the publication dates and don’t expect to find similar conditions as reported by the author(s). We’re publishing these reports for their historical value, plus, they may give you some leads on games to investigate. – A.S.]
A little about blackjack in Russia first. This is a blackjack magazine, isn’t it?
Every casino in Moscow offers blackjack and many are good for card counters—at least until you’re blacklisted. As always, the standard rules are 6 deck, dealer stands on soft 17, double on any two cards, double after splits, resplit pairs, resplit aces (1 card on each), early surrender, European no hole card (dealer takes all).
Five casinos offer the triple rule on blackjack games (Golden Palace, Golden Palace Weekend, Crystal, Studio, and Imperia), which means you can triple down after doubling down. Some casinos offer bonuses for hands like 678 and 777 (Horseshoe, Alexander Blok, and Izmaylovo). Some casinos have early surrender against 10 only (Cosmos, Mizar, Metropol, Izmaylovo, Infant, Alexander Blok, Beijing In Moscow). Some offer loss rebates (Royal, Ambassador, Grand Prix, and Casablanca).
What’s the news? The Golden Palace group bought Imperia. Not only have they added the triple rule, they’ve also installed the same blacklist of card counters used at the Golden Palace.
Also, the new casino SOL was opened. They offered triple during their first month but soon killed it and introduced two Double Exposure Tables (blackjacks pay 1:1, no surrender, dealer wins ties, DOA, DAS, NRS).
Casino Oasis changed its name to the Lilit but no change in the rules. Crystal killed penetration — 50% on all blackjack tables now. Izmaylovo pays 2:1 for suited blackjacks. Vinso Grand adds 25% to your blackjack payoff during 2 lucky hours per day. Shangri La has unlimited insurance up to ½ table max.
During the next few weeks the new casino Pharaoh will open. No information yet about its blackjack rules. It looks like Cosmos has some financial interest in this casino. In February, Golden Mine should open in place of the temporarily closed Treasure Island.
The general trend in Moscow blackjack does not look good for card counters. Penetration keeps getting worse and the heat keeps going up. More and more casinos are getting paranoid, and information is circulated between casinos more frequently. More and more card counters are switching to Caribbean Stud because it draws less heat.
History of Casino Play in Russia
Before I talk about opportunities in poker and some other games, here is a brief history of gambling in general and advantage play in the territory of the former Soviet Union.
As you may know, after the Socialist Revolution and until “perestroyka,” all gambling in the USSR was prohibited by law. The only exception was state lotteries (with a tremendous state advantage). Despite these prohibitions, home card games and such games as backgammon, checkers and chess were quite popular. By the way, maybe it’s because of this that Russian chess players are so strong — all the people had nothing to play except chess. :-))
Durak and Preference
The most popular home game in Russia was, and still is, not poker, as in the US, but a game called “durak,” which translates as “the fool.” It is played with a 36-card deck and has very simple rules. Perhaps every Russian citizen aged 7 and up knows how to play it.
Good durak play requires a lot of memory and mental calculation. There are even championships in this game now. Public opinion considers “the fool” a sucker game, but if you play it seriously there is real money to be won, thanks to the many bad players throughout the country. The last championship was won by Partizan — a famous person and professional player with a phenomenal memory. You know, one of those guys who can multiply 7-digit numbers in his head within seconds…
The other most popular card game is called “Preference,” and it is considered more elite, a game for intellectuals. It has quite complex rules and is hard to master. One usually starts to learn Preference in high school or at university. Almost all students know the rules, and many can play at a medium-to-high level. Like poker, this game should be played only for money. (The Fool is a different type of game: a family game played for fun, unless you play it professionally.) There are a lot of pro-level Preference players.
There are a few other card games that Russians like to play too. I don’t want to go into details.
So here was the picture: gambling was illegal, but in fact the whole country played various games, sometimes betting big money. This led to a lot of cheating. Decades of hidden gambling created thousands of scams, tricks and cheats.
There were so many suckers around that you didn’t need to be a sleight-of-hand wizard. Mastering a couple of the simplest tricks, such as a false cut, was enough to make a decent living. But true professional high-class cheaters playing for high stakes did and still do exist. So I urge anyone who wishes to play home games in Russia to be very cautious. Marked cards, electronic devices, false shuffles, and coolers — you never know what you may encounter.
So now come the casinos. After perestroyka, casino gambling was legalized and A LOT of casinos were opened. In 1991 there were 72 casinos in Moscow alone. Most of them were owned by the mob. This led to crime too. Cheating at casinos, “no-pay” cases and money laundering were widespread. Those were very unstable years. And there were really great rules!
Now the situation is far better. You won’t find cheating casinos in Moscow now. There were two “no-pay” cases last year, both at small casinos. The situation outside Moscow, however, especially at small local casinos, is unclear. They all have a very small reserve fund, and winning big can cause a “no-pay” incident. There are about 40 casinos in the capital and about 550-570 in total on the territory of the former USSR.
The most popular casino game here is roulette. I think it goes back to Dostoevsky and his precise, detailed portrait of the gambling-addicted protagonist of The Gambler. Roulette here is always single-zero. And theoretically, you can find very old and biased wheels in small towns.
Poker in Russia
Poker. Let’s talk first about poker played against other players. This game is growing in popularity. The most common type is 5-Card Draw with a joker. It is extremely popular and can be very profitable if you know your opponents. I think that Crazy Mike Caro could make a decent living here. 5-Card Draw is the only poker you will see played at home games, outside casinos. Home games are always limit, and the stakes depend on the players. Pot-limit games exist only in casinos, and no-limit is played only in tournaments.
Other variants of poker can be found in only a very few casinos here. Cosmos is the best-known place. The most popular game there is pot-limit Hold’em with a $5 ante. There are 7-Card Stud and Omaha too. Games such as Lowball or Hi-Lo Split are extremely rare, even in tournaments. Casinos Crown and Shangri La usually offer 5-Card Draw or 7-Card Stud, rarely Omaha or Hold’em. There is a new poker room at the new SOL casino, but I don’t know which games are spread there.
As for the skill level of the players: the average level is very low, but a few pro-level players do exist. They usually play at Cosmos.
Caribbean Poker in Russia
Caribbean poker. Yes, I know that this game is not very popular in the US. But it’s not the same in Russia. I think it’s the most popular casino card game here. That’s because the rules of this game differ greatly from international standards. In fact, some types of this game can be very advantageous.
Here is the most common set of rules: a 52-card deck, 6 boxes on the table. You can play 2 boxes (meaning you see your cards on the second box only after you make your decision on the first). You can change 1 or 2 cards after you see your hand; changing costs one ante. The dealer’s minimum qualifying hand is Ace-King. If you think about it a little, you will find that the second box has an advantage over the house!
And here is the next step: your friend sits at the next two boxes and you signal your cards to each other. Several teams here use this ploy with great success. And I haven’t even described the most advantageous rules! There exists 6-card Caribbean poker; sometimes you can change all five of your cards for one ante; sometimes you can FORCE the dealer to change his card for one ante; etc., etc., etc. All these rules can lead to a great advantage, especially with team play. The only minus is that this game is very slow.
Russian Lotteries
Lotteries. Yes, lotteries! These can also be profitable here. ANY casino here offers some kind of lottery tickets. Prizes range from $5,000 up to $500,000 (no kidding!).
If you’re a frequent customer, you can collect A LOT of those tickets and get nice odds in these lotteries. If you’re a hardcore casino visitor, and so lucky that you win every casino lottery in Moscow during a single week, you would find yourself richer by about 5-7 brand new cars and about $100,000 in cash. So, theoretically, fielding a large team of players at one casino can give you a good chance of winning its lottery.
As for other games in Russian casinos besides blackjack: Baccarat and Craps are very unpopular, with only a couple of tables in all of Moscow. There is one Pai Gow table, one Wheel of Fortune, and one Sic Bo table. Nothing spectacular, and almost no action.
Luck, Garry Baldie ♠
If you’re going to be traveling to play blackjack, see Stanford Wong’s Basic Blackjack for optimal basic strategy for any of the great rule sets you may encounter around the world. Wong’s Professional Blackjack contains the index numbers for card counters for unusual rule sets.
Poker players know this as the “Dead Man’s Hand.” Legend tells us that these are the cards Wild Bill Hickok was holding when he was shot down at a poker table in Deadwood, South Dakota, back in 1876.
To basic strategy blackjack players, aces and eights have an entirely different meaning. The common wisdom spouted by experienced players, dealers and pit bosses is: “Always split aces and eights.”
Virtually all players would agree that splitting aces makes sense as blackjack basic strategy. Who wouldn’t trade one hand starting with a total of 12, for two hands starting with 11 each? Only a moron would keep the 12.
But eights?
Sure, it makes sense to split those eights in two when the dealer is showing a potential bust card. Who wants a blackjack hand totaling 16? No one. Ever. So if the dealer’s got a five up, or a deuce, or any other pitiful low card, I’d rather take my chances with two hands starting with 8, than one lousy, rotten 16.
But that’s as far as the “common wisdom” may seem to make sense. When the dealer’s got a scare card showing (any nine, ten, or ace), why on earth would I want to split my 8s? Sure, I know a 16 still looks like a loser. But two hands starting with 8 each against these scare cards just look like two losers.
Or, so it seems…
In Beat the Dealer, Ed Thorp says that one of the hands that convinced the pit bosses he was a complete fool when he first hit Las Vegas as a blackjack card counter in 1960 was a pair of eights. Per his computer analysis, he always split them. Before computers came along, it was not common wisdom to always split 8s. Is there an understandable logic behind this basic strategy play?
As a matter of fact, there is.
What if, instead of splitting 8s, casinos allowed us to “toss” one eight whenever we were dealt a pair, and take our chances instead with whatever card the dealer dealt us to replace it?
That’s a no-brainer. A total of 8 isn’t nearly so bad a start on a blackjack hand as a total of 16. With the 8, you have a pretty decent chance of drawing a ten for an 18 total. And although I’m not exactly thrilled with the idea of pitting my 18 against the dealer’s ten up, I sure do like it better than a 16! And I might even draw an ace, 2, or 3 on my 8, giving me a chance at a much stronger hand than 16.
It’s not hard to see that a hand of 8 is a whole lot better than a 16 against a dealer ten. The 8 still looks like a loser, but nowhere near as bad a loser. If I could toss one 8, I’d do it in a heartbeat. But casinos don’t give me the toss option.
The option they do give me, splitting, seems bad because I’ve got to put more money on the table on a loser. Whether two hands of 8 each are better than one hand of 16 is really a question of which option loses the least. And I can only answer that if I know exactly how much more a 16 total costs me, over time, than an 8 total. How do I figure that out?
In 1956, a small group of mathematicians (Baldwin, Cantey, Maisel, and McDermott, memorialized by blackjack pros ever since as “The Four Horsemen”) used old-fashioned adding machines to calculate the answer to this and every other blackjack basic strategy question by tediously running through the math of every possible outcome. Their conclusion: split the 8s. Even with twice the money on the table, they found, you’ll fare better than you would taking your chances with that lousy 16.
For example, suppose the player has a sixteen against a dealer ten, with a $10 bet on the table. In the long run he will lose, on average, 40 cents on the hand. If that sixteen is a pair of 8s and he splits them, so that he has two $10 hands each starting with a total of 8, on average he will lose 16 cents on each of these hands, for a total loss of 32 cents. Splitting the 8s instead of playing the 16 thus saves 8 cents for every ten dollars of starting bet.
Splitting 8s versus dealer high cards is a classic defensive play. We know we will lose money on the play, but less than we would otherwise have lost.
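The arithmetic above can be sketched in a few lines. This is just a back-of-the-envelope check of the article’s figures (-40 cents per $10 for playing the 16, -16 cents per $10 hand for each split 8); the variable names are mine, and the expectations are the article’s numbers, not a fresh simulation.

```python
# Comparing the two options with a pair of 8s vs. a dealer ten,
# using the per-hand expectations quoted in the article.
BET = 10.00  # dollars on the original hand

ev_play_16 = -0.040 * BET          # hit/stand with the 16: lose 40 cents per $10
ev_split_8s = 2 * (-0.016 * BET)   # two hands of 8, each losing 16 cents per $10

print(f"Playing the 16: {ev_play_16:+.2f}")   # -0.40
print(f"Splitting 8s:   {ev_split_8s:+.2f}")  # -0.32
print(f"Saved by splitting: {ev_split_8s - ev_play_16:+.2f}")  # +0.08
```

Both options lose money; the split simply loses less, which is the whole point of a defensive play.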
Unfortunately, few players of that time believed The Four Horsemen. Then, in 1962, Ed Thorp ran the same hand through his IBM computer and, in a fraction of the time it took those mathematicians, confirmed their answer. That 16 total is so bad that when you see two 8s, you should throw the extra bet out there to split them with only one thought in your head: thank heavens that sixteen isn’t composed of a 9 and a 7, or a ten and a 6! With two 8s, you still have a fighting chance.
So, painful as it is, you should always split those eights. But if you ever find a casino that offers the “toss” option, don’t split, toss! You heard it here first. ♠
How to Use Frequency Distributions to Determine Your Card Counting Win Rate and Fluctuations
[Note from the author: In 1987, I published a 5-book series: Beat the 1-Deck Game, Beat the 2-Deck Game, Beat the 4-Deck Game, Beat the 6-Deck Game, and Beat the 8-Deck Game. Each book has 64 pages and consists primarily of charts that show the “frequency distributions” of the house/player advantages based on deck penetration, showing the profit potential of the game based on the player’s bet spread. Essentially, if you find a game with x-number of decks and x% penetration, you can flip through the charts to see how much of a bet spread you would need to beat the game sufficiently. The updated 2005 editions of these books are available from Cardoza Books. – A.S.]
[Thanks to Sam Case for lots of help with the original 1987 edition of this report. And thanks to Radar for editorial assistance on the 2005 editions.]
FOREWORD — IMPORTANT This is a guidebook for serious blackjack players who are attempting to get the edge over the casinos. Its purpose is to help you work out your win rate, fluctuations, and optimal blackjack betting strategy in the six-deck game. I’m making some assumptions about the users of this book.
1) I assume you are familiar with the game of casino blackjack as it is played in most legitimate casinos. By this I mean that you understand the rules of the game sufficiently to play comfortably, correctly employing the available rule options. This is not a primer.
2) I assume you know basic strategy for the game you are playing. Basic strategy decisions for the various hands have been known for some fifty years now. Virtually all books on card counting—and there are dozens of legitimate card counting systems in publication—provide essentially the same basic strategy decisions.
3) I assume you are using a legitimate card counting system to obtain an advantage at the game. Such a system would provide you with plus and minus values to apply to the various cards, with instructions on how to vary your bet and alter your playing decisions according to your count. If you are not a card counter, the information in this book will be of little use to you. If you are a card counter, the information herein will help you maximize your profits in the games available to you.
4) I assume that you have an understanding of the most common rule options offered in U.S. casinos. There are, in fact, dozens of rule options used in various casinos, and most books on card counting explain the most common variations available.
5) I assume that the blackjack games you play in are traditional blackjack games. The information provided in this guide does not apply to “Spanish 21,” “Super Fun 21,” single-deck blackjack where “BJ Pays 6-to-5,” Internet blackjack variations like “Pontoon” or “Blackjack Switch,” etc.
In fact, the information presented here will be of no use to those who play in web casinos, even in traditional blackjack games, because web casinos reshuffle the cards between every round of play, and the information in these pages is all based on deck “penetration.” This book deals with games in which at least 50% of the cards in play are being dealt out between reshuffles.
In other words, I assume that the user of this guide is a knowledgeable card counter. I will not spend much ink within these pages explaining the basics. I will recommend my own book, Blackbelt in Blackjack, for any blackjack player who wants to employ a card counting system or use other professional methods of winning at blackjack, from the beginner level to the advanced professional player.
The purpose of this book, Beat the 6-Deck Game, is to take a good card counter and turn him into a 6-deck expert. If you play in games with various numbers of decks, then I advise you to invest in the other reports in this series, which cover 1-deck, 2-deck, 4-deck, 6-deck and 8-deck games. The best way to comprehend the mathematics employed in this series is to read with a pocket calculator handy. Don’t be afraid to scribble in the margins with a pencil. All of the charts and numbers may appear forbidding at first glance, but the math is easy if you follow along.
You will not have to perform any of the mathematics within these pages while you are playing at the tables. The purpose of this book is to give you a clear understanding of how the various conditions you will find in the casino will mathematically affect your potential for winning—and how you must alter your attack on a game in order to beat it. This is not done while playing.
In fact, I have made every attempt to thoroughly analyze all of the most common attacks—and many uncommon attacks—on the various games, so that you will not have to do any math whatsoever if you prefer not to. I have tried to explain the charts clearly enough that you will be able to understand them at a glance. Please read the text carefully so that you can use the charts easily and accurately. —Arnold Snyder
What is a “Frequency Distribution?”
Definition: As used in this book, a frequency distribution is a table of numbers that tells us how frequently any player (or house) advantage occurs in a casino blackjack game.
In blackjack, we can analyze a card counting system and figure out our win/loss rate by using a frequency distribution. We know that the count is continually going up and down, and that sometimes the house has the edge, sometimes the player.
But to know exactly how much I expect to win or lose in a specific game, using some specific betting spread based on my count, I need more details. What I need to know is how often the different house/player advantages occur. That is, what are the frequencies of player advantages of 1%, 2%, 3%, etc.? And what are the frequencies of house advantages of 1%, 2%, 3%, etc.?
A player advantage of 2% over the house may occur 3 or more times per hundred hands, or only 3 times (or fewer!) per thousand hands, depending on the number of decks in play and how deeply into the deck(s) the game is dealt between reshuffles. The differences in advantage frequencies between a game with deep penetration and one with poor penetration make a HUGE difference to a card counter’s potential win rate.
In fact, many blackjack games are not beatable with any practical card counting strategy. And in many beatable games, your edge over the house is so small that the inevitable bankroll fluctuations will spell doom for the player with limited funds.
These Beat the Deck guides will not teach you how to count cards. But they will teach you how to choose games that can be beaten. If I know how often the player advantages and disadvantages occur in a game, I can figure out how much of a betting spread I need to beat the game. I can also figure out how much I should bet at each specific count in order to get the highest win rate I can obtain. And I can use these frequencies to figure out a betting strategy for minimizing my bankroll fluctuations.
There are two ways to draw up a frequency distribution. You can run a computer simulation of your system and then just look at the data the computer spits out. Or, you can use a mathematical formula for deriving the precise data you seek. There are arguments in favor of both methods. In this book, and in all of my Beat the Deck reports, I have used a combination of these two methods.
My goal in these reports is to provide data that can be used by all card counters, not just card counters using some specific count system. The advantages that occur in a casino blackjack game are about the same for all valid card counting systems. Some of the “advanced” systems can squeak out a bit more of an edge over the house, and some of the simpler systems are slightly weaker, but the actual differences are relatively small.
The information presented in these Beat the Deck reports is for an “average” card counting system. They will be highly accurate for a player using the Hi-Lo count from my book, Blackbelt in Blackjack. The more advanced Zen Count, also from Blackbelt in Blackjack, will be slightly stronger. The Red Seven Count (same book) will be slightly weaker; but the charts will be pretty accurate for all of these systems, as well as most other popular counting systems, especially in helping you to choose games and estimate the betting strategies needed to maximize both your profits and your chance of survival.
Technically, we cannot answer the question, “How frequently will a card counter have a 2% advantage over the house?” unless we first specify five conditions:
What are the rules of the game?
What counting system is the player using?
How many players are at the table?
How many decks are in play?
How many cards is the dealer dealing out prior to reshuffling?
For the sake of simplifying our analysis, we’re going to comment briefly on the first three conditions, so that we can concentrate on conditions #4 and #5, which are most important.
#1: The Blackjack Rules
There are dozens of common rule variations in U.S. casinos. Each rule has some positive or negative effect on the basic strategy player’s expectation as well as the card counter’s potential win rate.
One factor that simplifies our analysis is that most casinos in the U.S. have settled on a set of rules that gives the house approximately a ½% advantage over the player. In other words, the player starts at a disadvantage of -0.5%. In fact, 90% of the traditional blackjack games in U.S. casinos give the player a starting expectation between -0.4% and -0.6% off the top of the deck.
[Editor’s note: This has been changing in recent years, since the introduction of substandard payouts on naturals and the spread of such games. If you’re playing a game where blackjacks get paid 6:5 or even money, you’re playing a game with a much higher house edge.]
Likewise, most rule variations have only a minor effect for a card counter who is helped (or hurt) by the rules’ limitations on his play. So, for the sake of simplicity, all of the frequency distribution charts in this book assume that the house has a ½% advantage over the player off the top of the deck. This is your standard Las Vegas 6-deck game, where the dealer stands on soft 17 and the player may double down on any two cards.
If the dealer hits soft 17, but the player may double down after splits, the starting advantage is about the same. With late surrender and resplitting of aces allowed, and the dealer standing on soft 17, the house edge is only about 0.3% off the top. If you find yourself playing in a 6-deck game where the rules are more or less favorable than -0.5% off the top, you may use these frequency distributions as they appear here, but remember to adjust your final expectation up or down accordingly.
It would be more accurate to develop a separate frequency distribution chart for each set of rules—but you would not find the final result to be far from the simplified method I am advising. Because 6-deck games with starting advantages very different from –½% are rare, we will not bother analyzing them in more detail.
#2: The Card Counting System
Most valid card counting systems perform within one- to two-tenths of a percent of each other in computer simulations. So, instead of presenting the distribution to show how frequently each positive and negative “true” count will occur, I’m providing the data on how frequently each positive and negative advantage will occur.
It is, of course, necessary for you to know how your count relates to your advantage. If you use any of the counting systems from my book, Blackbelt in Blackjack, which adjust the running count according to the True Edge method, then your true edge count (minus the house advantage off the top) is your advantage. If you are using Stanford Wong’s Hi-Lo Count from his book, Professional Blackjack, then each true count (or count-per-deck) is equal to a 0.5% change in your advantage.
With a higher level count such as Wong’s Halves Count, Revere’s Point Count, or Uston’s APC, each count-per-deck is equal to (approximately) a 0.3% change in your advantage. If you are using a card counting system from a book that does not clearly explain the value of each point increase, then I would advise you to seek out a more advanced text. The system you are using may be valid, but if the author fails to provide you with a method for determining when you get an advantage, and how much of an advantage you get in percent as the count rises, then the book is just too elementary for a serious player.
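The count-to-advantage conversions described above amount to a simple linear rule. Here is a minimal sketch; the function and parameter names are mine, and the per-point values (0.5% per true count for Hi-Lo, roughly 0.3% per count-per-deck for higher-level counts) are the figures given in the text.

```python
def advantage(true_count: float,
              pct_per_point: float = 0.005,
              off_top: float = -0.005) -> float:
    """Approximate player advantage at a given true count,
    starting from the -0.5% off-the-top expectation assumed
    throughout this guide."""
    return off_top + pct_per_point * true_count

# Hi-Lo in a standard -0.5% game: a true count of +1 roughly
# erases the house edge; +3 gives about a 1% player advantage.
print(f"{advantage(1):+.1%}")   # +0.0%
print(f"{advantage(3):+.1%}")   # +1.0%

# A higher-level count at ~0.3% per count-per-deck needs a
# count of about +5 to reach the same 1% advantage.
print(f"{advantage(5, pct_per_point=0.003):+.1%}")   # +1.0%
```

If your game's off-the-top edge differs from -0.5%, adjust `off_top` accordingly, just as the charts themselves must be adjusted.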
With an unbalanced “running count” system, such as the simple Red Seven Count or Knock-Out Count, the values of each running count are constantly changing as the deck is depleted. In analyzing the Red 7 system with frequency distributions, it is important to note that your “pivot” will always reflect a 1% positive change from your starting advantage.
With simplified systems like these, you can use the charts in these Beat the Deck guides to compare the potential profitability of the games available to you, and to get a handle on how deep the penetration you will need, the betting spread required, etc. But to get really usable data on fluctuations, bankroll requirements, optimal betting strategies, etc., you will either need to use the True Edge method of running count adjustment, or switch to a balanced card counting system like the Hi-Lo Lite.
#3: Other Players at the Blackjack Table
In six-deck games a frequency distribution will look different if one player is at the table than if more than one player is playing. Technically, you would need seven different frequency distributions to estimate your potential win rate in games where the number of players varies from one to seven.
The problem with drawing up “accurate” frequency distributions to cover all possible situations is that there are hundreds of possible situations and these situations are always in flux. You might have five players at the table, with 75% (4½ decks) dealt. The very next shoe you might have three players at the table, with closer to 80% (4¾ decks) penetration, etc. The next shoe, with six players at the table, a new dealer comes on who deals more slowly, cuts the penetration to about 72%, delivering fewer hands per hour. Imagine the possibilities, with anywhere from one to seven players at the table, varying penetration between shoes, and the idiosyncrasies of various dealers.
All of the distributions presented in this guidebook assume you are alone at the table. If you are not alone at the table, the distribution will still be fairly accurate, assuming you use all of the information available to you when making your betting and playing decisions. It will be slightly more advantageous for you to sit at third base, or as close to third base as possible, as this will allow you to play your hands with more information based on the hit cards of players who must play before you. As a rule, in all shoe games, when playing with other players at the table, always seek out situations in which you can see as many cards as possible prior to playing your hand.
#4: The Number of Decks in Play
A frequency distribution for a six-deck game differs so widely from a frequency distribution for a four-deck game or an eight-deck game that it is necessary to draw up separate distributions based on the number of decks in play. Do not attempt to use the distributions in this guidebook to approximate your advantage in any game other than a 6-deck game. Separate guides are available from the publisher for one-deck, two-deck, four-deck, and eight-deck games.
#5: Card Counting and Deck Penetration
Once you know how many decks are in play and the house advantage off the top afforded by the rules, the prime concern of the card counter is the deck penetration—i.e., how many cards are being dealt out prior to reshuffling. In a 6-deck game, if a dealer will not deal another round once three decks have been used, use the 50% penetration chart to analyze the game.
A six-deck game with less than three decks dealt out is generally a waste of time for card counters, so I have not bothered to analyze such games. Technically, if such a game had exceptionally good rules—such as double on any two cards, double after splits, dealer stands on soft 17, resplit aces and surrender—and/or if you could use a very large betting spread, say 1-to-25 units or more, then a 6-deck game with poor penetration might show slim profits for a card counter.
However, if you look at the pitiful returns on the typical 6-deck games with 50% dealt out, you can imagine what a waste of time it would be to attack a 6-deck game with lesser penetration. Most 6-deck games should be analyzed with the 65% (4 decks) and 75% (4½ decks) dealt charts. An 85% (5 decks) dealt game is way better than an average profit opportunity. Use the 85% chart only for dealers who deal down into the last deck prior to reshuffling. Remember, 85% of six decks is 265 cards. An 85%-penetration dealer would be one who would only shuffle when he had 47 or fewer cards undealt.
How often do you find a shoe dealer who would deal another round with less than a deck remaining in the shoe? Such dealers are extremely rare, but they do exist. Low stakes players are more likely to run into such situations. If you do find yourself in such a situation, use the 85% chart to analyze your possibilities.
Reading the Frequency Distribution Charts
All of the frequency distribution charts are set up identically to the sample chart below.
The top line reads: 6 Decks, 75% Dealt, Bet Strategy = 1:2:4:8:16. This tells us that the chart applies to a 6-deck game in which 75% of the cards (4½ decks) are being dealt out between shuffles. “Bet Strategy = 1:2:4:8:16” would apply to a player who is spreading his bets from 1 to 2 to 4 to 8 to 16 units. This might be from $1 to $16, or $5 to $80, $10 to $160, etc.
6 Decks 75% Dealt
Bet Strategy 1:2:4:8:16

 Adv.    Hands   Sys1   Sys2   Sys3   Sys4   Sys5   Sys6   Sys7   Sys8
-4.5%      0.0      1      1      1      1      1      1      1      1
-4.0%      0.0      1      1      1      1      1      1      1      1
-3.5%      1.0      1      1      1      1      1      1      1      1
-3.0%      2.0      1      1      1      1      1      1      1      1
-2.5%      3.0      1      1      1      1      1      1      1      1
-2.0%      4.0      1      1      1      1      1      1      1      1
-1.5%      8.0      1      1      1      1      1      1      1      1
-1.0%     13.0      1      1      1      1      1      1      1      1
-0.5%     34.0      1      1      1      1      1      1      1      1
+0.0%     13.0      2      1      1      1      1      1      1      1
+0.5%      8.5      4      2      1      1      1      1      1      1
+1.0%      4.5      8      4      2      1      1      1      1      1
+1.5%      3.5     16      8      4      2      1      1      1      1
+2.0%      2.0     16     16      8      4      2      1      1      1
+2.5%      2.0     16     16     16      8      4      2      1      1
+3.0%      1.0     16     16     16     16      8      4      2      1
+3.5%      0.5     16     16     16     16     16      8      4      2
+4.0%      0.0     16     16     16     16     16     16      8      4
+4.5%      0.0     16     16     16     16     16     16     16      8
+5.0%      0.0     16     16     16     16     16     16     16     16
+6.0%      0.0     16     16     16     16     16     16     16     16
+7.0%      0.0     16     16     16     16     16     16     16     16
+8.0%      0.0     16     16     16     16     16     16     16     16

Hands Bet/100:       100    100    100    100    100    100    100    100
Average Bet:        3.05   2.29   1.82   1.46   1.23   1.09   1.03   1.01
Gain/Hand:         0.029  0.022  0.016  0.008  0.003 -0.001 -0.003 -0.004
Win Rate %:        0.95%  0.97%  0.86%  0.58%  0.22% -0.12% -0.30% -0.37%
Units/Hour*:        2.90   2.22   1.55   0.84   0.27  -0.13  -0.31  -0.38
S.D./Hour*:         58.7   46.9   37.9   28.0   19.9   13.6   11.6   11.1
Units/10 Hours*:    29.0   22.2   15.5    8.4    2.7   -1.3   -3.1   -3.8
S.D./10 Hrs*:      185.5  148.2  119.7   88.6   62.9   43.0   36.6   35.0
Units/100 Hours*:    290    222    155     84     27    -13    -31    -38
S.D./100 Hours*:   586.7  468.8  378.6  280.2  198.8  135.8  115.6  110.8
Units/1000 Hours*:  2900   2215   1553    843    270   -130   -310   -375
S.D./1000 Hours*: 1855.4 1482.3 1197.2  886.2  628.5  429.6  365.7  350.4
Units/10000 Hours*: 29000  22150  15525   8425   2700  -1300  -3100  -3750
S.D./10000 Hours*: 5867.2 4687.6 3785.8 2802.3 1987.6 1358.4 1156.3 1108.2

* Based on 100 hands seen/hour, betting the number of hands in Hands Bet/100.
The first column, labeled Adv., shows the various player advantages, positive or negative, in percent. The Adv. column lists the various advantages that will occur in the game from a 4.5% house advantage to an 8% player advantage. If the advantage has a minus sign (such as -3.5%), then it is a house advantage. If positive (3.5%), then it is a player advantage. Neither the house nor the player ever has an advantage greater than +/- 3.5% in this game. With greater penetration (85%), the player sometimes sees an advantage of up to 4.5%.
The second column, labeled Hands, shows how many hands per 100 will occur with the advantage shown in the first (Adv.) column. Example: In this 6-deck game with 75% (4½ decks) dealt, a player advantage of 1% will occur about 4.5 times per hundred hands. A house advantage of 1% will occur about 13 times per 100 hands. Note that the house advantage of 1% occurs three times as often as the player advantage of 1%. This is because we assume the house has a ½% edge (0.5%) off the top. In a sense, the house gets a running start on the player.
Following these two columns, there are eight columns labeled Sys1, Sys2, and so on through Sys8. These columns identify eight different betting systems in which the player is using a 1:2:4:8:16 betting spread.
Note that each of these columns has a list of numbers, all of which are 1, 2, 4, 8, or 16. These numbers are the player bet size in units.
In Sys1, for instance, we see that the player bets 1 unit whenever the Adv. column is negative (meaning the house has an advantage). The player’s bet rises to 2 units when the advantage is 0.0% (dead even), 4 units when the advantage is 0.5%, 8 units with a 1% advantage, and 16 units with any advantage greater than 1%. In all of the betting systems, the player is using a 1-to-16 spread. The only difference in the systems is the advantage at which the player raises his bets from 2 to 16 units.
It is very important that you understand how to read this top portion of the chart, which describes the card counter’s betting strategies in this game. All of the charts in this report, and in all of the Beat the Deck reports, use this same format. Most card counters, if spreading their bets from $5 to $40, would simply say, “I was using a 1-to-16 betting spread.” This may be true, but it is very important to know what your advantage is when you are raising your bets. If you don’t provide this information, you cannot analyze your expectation in the game.
So before you go on to the explanation of the bottom portion of the chart, which provides the analyses of the different betting systems, look at the Sys1 through Sys8 betting strategies and be sure you understand how these betting systems differ from each other. Now, on to the bottom portion of the chart …
The “Hands Bet/100” line shows how many hands out of every 100 hands played the player would expect to place bets. In all of the examples shown in our sample chart, the player bets on all 100 hands. If the player were table-hopping, however, and not betting on negative expectation hands, this would be indicated in the charts by placing zeros in the SYS columns for all of the (dis)advantages where the player placed no bet, and the Hands Bet/100 would be lower than 100.
If you flip forward a few pages, you will see numerous charts where the betting system (Sys) columns have zero (0) entries when there is a house advantage. You will note that in these instances, the Hands Bet/100 reflects the actual number of hands per hundred that the card counter placed bets on.
If your method of play is to not sit down, but to wander from table to table as you play, making your decisions to exit a game whenever your count indicates that the house edge has gone up to a point where you refuse to place even a 1-unit bet, then you will use the chart(s) where the Sys columns show the zero (0) entries. These charts are easily identified by the top line, as the heading will read: Bet Strategy = 0:1:2, or 0:1:2:4, etc.
Just beneath the Hands Bet/100 line, the “Average Bet” line indicates the player’s average bet per hand played in units for each betting system. For instance, in using Sys1, note that the player’s average bet is 3.05 units, while in using Sys8, the average bet is only 1.01 units. This is because the Sys8 player is waiting longer to raise his bets.
Beneath the Average Bet line, the “Gain/Hand” shows how many units the player can expect to win for every hand he plays with each betting system. (If this is a negative number, then it shows how many units the player can expect to lose per hand played over the long run.)
In our sample chart, all of the examples are positive. If you flip forward a few pages, however, you will see many examples where the Gain/Hand is negative, which means that the card counter would actually be losing money in the long run in that game with that particular betting system. This occurs most frequently when the counter’s betting spread is too small to beat the game, especially when the percentage of cards being dealt out between shuffles is insufficient.
Beneath the Gain/Hand line, the “Win Rate %” line shows the player’s long run expectation in percent over the house. (If this were a negative number, it would indicate the house’s advantage over the player.)
Note that with Sys2, the player’s win rate in this game, using a 1-to-16 betting spread, maximizes at 0.97%. Using Sys6, 7 or 8, the player’s win rate is negative. This is because the Sys6, 7 and 8 bettors are waiting too long to raise their bets.
Beneath the Win Rate % line, we find the hard data we’re most interested in: how much money (in units) we can expect to make, on average, for every hour of play, every ten hours of play, every hundred hours of play, etc., in this game. Note that the “Units/Hour” is always based on seeing 100 hands per hour. A crowded table will be slower than this, and if we’re playing heads up, or with only one other player at the table, we’ll probably play more than 100 hands per hour.
Note that if we employ Sys2 for our betting strategy, we can expect to earn 2.22 units per hour, or 22.2 units per 10 hours of play, etc. To get our win rate in dollars and cents, we simply multiply this number by our betting unit. For instance, if my 1-to-16 betting spread is obtained by spreading from $10 to $160, then my unit is $10, and my expected Sys2 win rate in dollars is:
2.22 x $10 = $22.20 per hour.
Note that under each Units/Hour line is a line that shows “S.D./Hour,” “S.D./10 Hours,” etc. S.D. stands for standard deviation. This is the normal fluctuation from your expected win, in units. We can use this measure to estimate how much we should be betting in any given game, with any given betting strategy, based on the actual size of our bankroll.
Standard deviation is a commonly used statistical measure. In our Units/10 Hours example for Sys2, for example, you can see that we have an expected win of 22.2 units, with a standard deviation of 148.2 units. What does this mean?
It means that two-thirds of the times that you play for ten hours (or more precisely 1000 hands) under these conditions, your outcome will be within one standard deviation of your expected win. And 95% (or 19 out of 20) of the times that you play 1000 hands under these conditions, you will be within two standard deviations of your expected win. You will be within three standard deviations of your expected win 99+% of the time. (For a comprehensive discussion on standard deviation, see Blackbelt in Blackjack.)
In our Sys2 example, this means that after 1000 hands, we would expect to be ahead by about 22.2 units. Two-thirds of the time we’ll actually be within one standard deviation (148.2 units) of this expectation. In other words, we’ll actually be somewhere between –126 units and +170.4 units two-thirds of the time.
Ninety-five percent of the time we’ll be within two standard deviations (148.2 x 2 = 296.4 units) of our expected win. In other words, we’ll be between –274.2 units and +318.6 units. And we are virtually assured of always being within three standard deviations (148.2 x 3 = 444.6 units) of our expected win—i.e., between –422.4 units and +466.8 units.
Professional players use the standard deviation to calculate the best betting unit for their particular bankroll. For example, it would be foolish to spread your bets from $5 to $80 in this game if your total bankroll was only $400. Since your expected win per 1000 hands is only $5 x 22.2 = $111.00, and your standard deviation on 1000 hands is 148.2 units, and 148.2 x $5 = $741, a small negative fluctuation within one standard deviation could easily wipe you out.
To spread your bets from $5 to $80 in this game, you would be relatively safe (meaning likely to survive) in ten hours of play if you entered the game with enough of a bankroll to withstand a fluctuation of two standard deviations. Since one standard deviation is $741, and our expected win is $111, a negative fluctuation of 2 S.D.s would set us back $111 – (2 x 741) = $1371. So, a bankroll of about $1400 would survive about 95% of the time (19 out of 20 times).
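The bankroll arithmetic above can be sketched in a few lines of Python. This is just the Sys2 example from the text restated as code: a $5 unit, an expected win of 22.2 units per 1000 hands, and a standard deviation of 148.2 units, with the two-standard-deviation survival criterion described above.

```python
# Bankroll check for the Sys2 example: $5 unit, 1000 hands (10 hours).
unit = 5.0          # dollar value of one betting unit
ev_units = 22.2     # expected win in units per 1000 hands (from the chart)
sd_units = 148.2    # standard deviation in units per 1000 hands (from the chart)

ev_dollars = ev_units * unit            # $111.00 expected win
sd_dollars = sd_units * unit            # $741.00, one standard deviation

# Worst outcome within two standard deviations of expectation:
worst_2sd = ev_dollars - 2 * sd_dollars

# A bankroll covering that swing survives roughly 95% of 10-hour sessions.
bankroll_needed = -worst_2sd            # $1371; round up to about $1400
print(ev_dollars, sd_dollars, bankroll_needed)
```

The same three lines of arithmetic work for any chart entry: swap in the unit size and the Units/SD figures for the betting system and time frame you are considering.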
Using the charts in this report, you will begin to see patterns in the numbers. For instance, every betting spread has an ideal point at which to raise your bet in order to maximize your expected unit win. The more units you bet at positive advantages, the greater your expected win in units, but also, the greater the fluctuations (and the bigger the bankroll needed to survive).
If you lower your top bet substantially in order to cut the flux, however, you may not be able to get a significant edge over the house. The problem you face as a professional gambler is to balance the need to aggressively pursue return (expected win) on your investment dollars with the overriding need to protect your bankroll.
Using the charts in this book, you can quickly see what the return is with any betting strategy, along with its associated risk. You can also use these charts to compare profit opportunities when multiple games are available to you. If one casino deals out only 65% of the cards between shuffles, while another deals out 75%, you will almost always find the game with deeper penetration to have greater profit potential.
This may not always be the case, however. If the casino with the deeper penetration is very paranoid about card counters and shuffles up any time a player raises his initial bet by a factor of 4, while the casino with lesser penetration has no problem when you spread from $5 to two hands of $40 (a 1-to-16 spread), then this more aggressive betting strategy will often compensate for poorer penetration.
Many players are amazed at how severe normal fluctuations can be, even when they have the edge in their favor. It is not uncommon for players who lose heavily in a casino to worry that the game may have been dishonest. Or they may question the validity of the card counting system they are using, or even their own abilities in employing the system.
If you suffer inordinate losses on a trip, you can use the data in these charts to see if your losses fall within the realm of what a mathematician would consider “normal.”
The Technical Appendix, which follows the charts, explains how you can insert any betting system you want into the frequency distribution charts, and estimate your average bet, gain per hand, win rate in %, win rate in units, and standard deviation. If you are competent with a spreadsheet type program like Excel, you could use it to do all of the math for you. But it is not difficult to do it with a simple pocket calculator.
For players who would prefer not to do any math whatsoever, I have analyzed in the charts that follow most of the practical approaches a player might take to attempt to beat the various games.
Also contained in the Technical Appendix, for those who have a greater interest in the mathematics of blackjack, are explanations of how to figure out the standard deviation for any number of hands, using any bet spread. The charts show standard deviation data for 100 hands, 1,000 hands, 10,000 hands, etc., with specifically defined betting systems, but you may want data for 3,200 hands, or 620 hands, or some number of hands not defined in the charts. There is also a simple explanation of how to estimate the effects of playing multiple simultaneous hands.
Now, on to the charts . . .
Note: There are 44 pages of frequency distribution charts in the print version of this book. The charts contain Arnold Snyder’s comments on each betting approach. These charts are not part of this online excerpt.
See the 2005 print version for the full collection of charts. Also, note that the frequency distribution charts for the 1-, 2-, 4-, and 8-deck games are very different from each other and from the 6-deck charts, and each has its own Beat the Deck book.
Technical Appendix: How to Analyze Card Counting Betting Strategies Not Included in this Book (such as Betting Strategies for Big Player Call-In Blackjack Teams)
Let’s analyze one system completely so that you understand the math. We’ll take a 6-deck Las Vegas game with 75% penetration. You may double down on any two cards, including after splits, and the dealer stands on soft 17.
You leave the table when your advantage falls below –1%, otherwise play $2 off the top and on all other negative and zero expectation hands, $7 if you have a ½% advantage, and $12 if you have a 1% advantage or higher.
Using a sheet of lined paper, label the first column “Hands,” and simply fill in the numbers from the Hands column from any of the “75% Dealt” charts in this report. (Flip back a few pages to look at any of these 75% charts, and you’ll see that the numbers I’ve filled in here match the numbers in the second column of those charts.)
Label the second column “Bet,” and we’ll fill in these numbers later. Leave it blank for now. Label the third column “Total,” and leave this blank also.
Label the fourth column “Adv.,” and here we enter the numbers from any of the 75% charts that are in the Adv. column. Note that this is the first column in those charts. The only difference here is that we should enter the numbers here in decimal format, rather than as percentages. In our 75% charts, we show the advantage as -4.5%, while here we enter -0.045.
If you are unsure how to translate percentages into decimals, the method should be clear if you briefly study the numbers I’ve already filled in. Basically, 1% = .01, 1.5% = .015, 2% = .02, etc.
Label the fifth and last column “Gain,” and leave this blank for now also. Finally, place multiplication signs between the Hands and Bet columns, and also between the Total and Adv. columns. And place equal signs between the Bet and Total columns, and also between the Adv. and Gain columns. (See the example below.) Here’s how the first five lines of your template should look:
Hands     Bet      Total     Adv.      Gain
 0.0  x  ____  =  ____  x  -.045  =  ____
 0.0  x  ____  =  ____  x  -.04   =  ____
 1.0  x  ____  =  ____  x  -.035  =  ____
 2.0  x  ____  =  ____  x  -.03   =  ____
 3.0  x  ____  =  ____  x  -.025  =  ____
In other words, two of your columns are filled in using numbers from the charts in this book that show the player/house advantage, and the number of hands that occur at each advantage. We are now ready to fill in our betting system, then perform a couple of simple multiplication operations.
If you are familiar with computer spreadsheets, you can see that it would be very convenient to set up a template according to the above guidelines for analyzing betting systems and letting the computer do all the math. But all you really need is a pocket calculator and a sheet of paper. The math is simple once you have the chart set up.
In the second column, labeled “Bet,” you enter your betting system. Since, in this example, we are leaving the table if our advantage falls below -1%, we enter a zero (0) in this column wherever the Adv. reads negative from -.045 to -.015. We then enter a 2-unit bet for all advantages from -1% (-.01) to zero, a 7-unit bet where our advantage is ½% (+.005), and a 12-unit bet for all advantages of 1% (+.01) or higher. Look at the numbers in the Bet column and see that you understand this process.
To get the numbers in the Total column, we simply multiply the numbers in the Hands column by the numbers in the Bet column. To get the numbers in the Gain column, we multiply the numbers in the Total column by the numbers in the Adv. column. Finally, we add up all the numbers in the Total column and the Gain column. This is how it looks when it’s all filled in:
Hands     Bet      Total     Adv.      Gain
 0.0  x    0  =     0  x  -.045  =   0.0000
 0.0  x    0  =     0  x  -.04   =   0.0000
 1.0  x    0  =     0  x  -.035  =   0.0000
 2.0  x    0  =     0  x  -.03   =   0.0000
 3.0  x    0  =     0  x  -.025  =   0.0000
 4.0  x    0  =     0  x  -.02   =   0.0000
 8.0  x    0  =     0  x  -.015  =   0.0000
13.0  x    2  =    26  x  -.01   =  -0.2600
34.0  x    2  =    68  x  -.005  =  -0.3400
13.0  x    2  =    26  x   0     =   0.0000
 8.5  x    7  =  59.5  x  +.005  =  +0.2975
 4.5  x   12  =    54  x  +.01   =  +0.5400
 3.5  x   12  =    42  x  +.015  =  +0.6300
 2.0  x   12  =    24  x  +.02   =  +0.4800
 2.0  x   12  =    24  x  +.025  =  +0.6000
 1.0  x   12  =    12  x  +.03   =  +0.3600
 0.5  x   12  =     6  x  +.035  =  +0.2100
 0.0  x   12  =     0  x  +.04   =   0.0000
 0.0  x   12  =     0  x  +.045  =   0.0000
 0.0  x   12  =     0  x  +.05   =   0.0000
 0.0  x   12  =     0  x  +.06   =   0.0000
 0.0  x   12  =     0  x  +.07   =   0.0000
 0.0  x   12  =     0  x  +.08   =   0.0000

TOTALS:          341.5                +2.5175
There is one more operation we need to perform before we have all of the data we need for a complete analysis. We need to know how many hands (per 100) the player is betting on. If we add up the numbers in the Hands column, we see that they sum to 100. The player using this betting system, however, is not betting when the advantage is against him by more than 1%. So, we total up only the numbers from this column where the bet is greater than zero (0). We find: 13 + 34 + 13 + 8.5 + 4.5 + 3.5 + 2 + 2 + 1 + .5 = 82 Hands played
Now, let’s do the math . . .
To calculate your Average Bet per hand, divide the Total Units Bet by the total number of hands played:
341.5 / 82 = $4.16
To calculate your Gain Per Hand in dollars, divide the Total Gain by the total number of hands played:
2.5175 / 82 = .031
To calculate your Win Rate in per cent, divide the Gain Per Hand by the Average Bet:
.031 / 4.16 = .0075 or 0.75%
To calculate your Expected Win in dollars for any number of hands played, multiply your Gain Per Hand times the number of hands played. For example, your expected win after 225 hands played:
.031 x 225 = $6.98
Note: Since we entered the actual dollar amounts of our bets in the Bet Per Hand column, i.e., $2 = 2, we do not have to convert units to dollars. If we were spreading from two $5 chips up to twelve $5 chips, then we would have to multiply our gain per hand in units by 5 to get the gain in dollars.
To summarize, if we played in this 6-deck game, spreading our bets from $2 to $7 to $12, and we played approximately 82 hands of the 100 hands we saw per hour, we would have a win rate of 0.75%, which would give us a long run expectation of about $2.54 per hour (which is simply the 82 hands played per 100 seen times our gain per hand of .031 (@3 cents per hand)).
If we were spreading from two $5 chips ($10) to 7 chips ($35) to 12 chips ($60), our percentage win rate would be identical (0.75%), but our expectation in dollars would be about $12.70 per hour.
Using the above methodology, you should be able to analyze virtually any count-based betting strategy using the frequency distribution data from the charts in this guide.
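The template arithmetic above lends itself naturally to a short program. This sketch in Python copies the hand frequencies and advantages from the 75%-dealt chart and encodes the $2/$7/$12 betting rule described in the example; it reproduces the totals we just worked out by hand.

```python
# Frequency distribution for 6 decks, 75% dealt: hands per 100 at each advantage.
advs  = [-0.045, -0.04, -0.035, -0.03, -0.025, -0.02, -0.015, -0.01, -0.005,
          0.0, 0.005, 0.01, 0.015, 0.02, 0.025, 0.03, 0.035,
          0.04, 0.045, 0.05, 0.06, 0.07, 0.08]
hands = [0.0, 0.0, 1.0, 2.0, 3.0, 4.0, 8.0, 13.0, 34.0,
         13.0, 8.5, 4.5, 3.5, 2.0, 2.0, 1.0, 0.5,
         0.0, 0.0, 0.0, 0.0, 0.0, 0.0]

def bet(adv):
    """$2/$7/$12 spread: leave the table below a -1% advantage."""
    if adv < -0.01:
        return 0    # walk away
    if adv <= 0.0:
        return 2    # $2 on negative-to-even counts
    if adv < 0.01:
        return 7    # $7 at a 1/2% advantage
    return 12       # $12 at a 1% advantage or better

total_bet    = sum(h * bet(a) for h, a in zip(hands, advs))      # 341.5
total_gain   = sum(h * bet(a) * a for h, a in zip(hands, advs))  # 2.5175
hands_played = sum(h for h, a in zip(hands, advs) if bet(a) > 0) # 82

avg_bet   = total_bet / hands_played    # ~$4.16 per hand played
gain_hand = total_gain / hands_played   # ~$0.031 per hand played
win_rate  = gain_hand / avg_bet         # roughly 3/4 of 1%
```

To analyze a different betting system against the same chart, only the `bet` function changes; the rest of the arithmetic is identical.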
What if you estimate that the penetration is not quite 75%, but greater than 65%?
You can get a pretty close estimate of your expectation with 70% dealt if you look at the results for 65% and 75%, then calculate the midpoint.
For example, if you look in the charts for using Sys2 with a 1-to-16 spread, you’ll find that your estimated win rate is 0.59% with 65% dealt, and 0.97% with 75% dealt out. If you estimate that in fact about 70% of the cards are being dealt out between shuffles, the midpoint between these win rates is 0.78%.
After you find the midpoint, always round this number down; the actual win rate is probably between 0.65% and 0.70%. You can use this same method of interpolating results for different levels of penetration to approximate your average bet, your win rate in units per hour, etc.
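The interpolation itself is one line of arithmetic; here it is as a tiny Python helper, using the Sys2 win rates quoted above (0.59% at 65% dealt, 0.97% at 75% dealt):

```python
def interpolate_win_rate(rate_65, rate_75):
    """Estimate the win rate with ~70% dealt as the midpoint of the
    65%- and 75%-dealt results (then round down, per the text)."""
    return (rate_65 + rate_75) / 2

# Sys2 example from the charts, rates in percent:
midpoint = interpolate_win_rate(0.59, 0.97)  # 0.78
```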
Standard Deviation in Card Counting
To calculate your standard deviation for any number of hands, this is how you do it:

1. First, add up the number of hands played at each of the different bet sizes.
Using the same $2 to $12 sample betting system described above, we placed $2 bets on
13.0 + 34 + 13.0 = 60 hands
We placed $7 bets on 8.5 hands.
And we placed $12 bets on
4.5 + 3.5 + 2 + 2 + 1 + 0.5 = 13.5 hands
2. Now, square each bet size, multiply the bets squared by their respective numbers of hands played, then add up these products. It looks like this:

2² x 60 = 240
7² x 8.5 = 416.5
12² x 13.5 = 1944
Total: 2600.5
3. Now, take the square root of this number and multiply it by 1.1, like this: √2600.5 x 1.1 = 56.09

4. Finally, divide this number by the square root of the number of hands played (per 100).
56.09 / √ 82 = $6.19
This is your standard deviation, in dollars, per hand. To find your standard deviation, in dollars, for any number of hands, multiply your standard deviation per hand times the square root of the number of hands. Examples:

The standard deviation on 100 hands = 6.19 x √100 = $61.90
The standard deviation on 1000 hands = 6.19 x √1000 = $195.74
The standard deviation on 10,000 hands = 6.19 x √10,000 = $619.00
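The four steps above translate directly into code. This Python sketch uses the bet sizes and hand counts from the $2/$7/$12 example, with 1.1 as the per-hand blackjack variance factor the text applies in step 3.

```python
import math

# (bet size in dollars, hands per 100 placed at that size)
bets = [(2, 60.0), (7, 8.5), (12, 13.5)]

# Steps 1-2: sum of (bet squared) x (hands at that bet size).
sum_sq = sum(b * b * n for b, n in bets)           # 2600.5

# Step 3: square root, times 1.1 (blackjack per-hand variance factor).
step3 = math.sqrt(sum_sq) * 1.1                    # ~56.09

# Step 4: divide by the square root of hands played per 100.
hands_played = sum(n for _, n in bets)             # 82
sd_per_hand = step3 / math.sqrt(hands_played)      # ~$6.19

# SD for any number of hands scales with the square root of the hands:
sd_1000_hands = sd_per_hand * math.sqrt(1000)      # ~$195.9
```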
Playing Multiple Hands While Card Counting
If you play multiple simultaneous hands, you may estimate your average bet, gain/hand and win rate, by multiplying the number in the Hands column by the number of simultaneous hands you are playing at that count. Do not use this method, however, to estimate your standard deviation.
If, for instance, you play two hands of four units each at a specified true count, you would underestimate the standard deviation if you simply added the extra hands into your calculations. This is due to the fact that simultaneous hands will have more of a tendency to have the same result, since they are both played vs. the same dealer hand.
If you instead estimated your standard deviation as if two 4-unit hands were a single 8-unit hand, you would overestimate the standard deviation. The actual standard deviation would fall somewhere between these two results.
A simple way to estimate your standard deviation on two simultaneous hands is to simply estimate the standard deviation on one hand that is 75% of the total amount bet on both hands. In the example above, estimate your standard deviation on two 4-unit hands as if you were playing one 6-unit hand.
For three simultaneous hands, estimate your standard deviation as if you were playing one hand that is 60% of the total amount bet. In other words, with three simultaneous hands of $10 each, take 60% of the $30 total bet and compute your standard deviation as if playing one hand of $18. Again, this is a simplification, but it will give you a good ballpark estimate.
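These two rules of thumb can be wrapped in a small helper; the 75% and 60% factors are the approximations given above, nothing more.

```python
def effective_single_bet(bet_per_hand, num_hands):
    """Approximate the SD of multiple simultaneous hands as one hand
    at a fraction of the total bet: 75% for two hands, 60% for three."""
    total = bet_per_hand * num_hands
    if num_hands == 1:
        return total
    if num_hands == 2:
        return 0.75 * total   # e.g., two 4-unit hands -> one 6-unit hand
    if num_hands == 3:
        return 0.60 * total   # e.g., three $10 hands -> one $18 hand
    raise ValueError("rule of thumb given only for 1 to 3 hands")
```

Feed the result into the standard deviation calculation above in place of the actual bets, and you get the ballpark estimate the text describes.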
Accuracy
All of the frequency distributions, estimates of win rates, standard deviations, etc., in this book are approximations of what human players might expect in casino play. If the charts herein estimate your win rate at $20 per hour, then your actual win rate is probably between $15 and $25 per hour.
Don’t assume pinpoint precision. Even if we were to run a billion-hand computer simulation to obtain a highly precise estimate for a specific counting system, it would not necessarily provide a better estimate of your expectations in a real-world casino.
Casino dealers vary their levels of penetration. Different numbers of players at the table may affect the shuffle point, change the number of hands dealt per hour, etc. And even the best card counters make errors in “rounding off” their count adjustments and they apply different amounts of betting and playing strategy “camouflage” as needed in the casinos where they play.
Use the data in these charts to compare the profit opportunities in the games available to you, determine the betting spread you need to get a sufficient edge over the house, and estimate your bankroll requirements.
Finally . . .
The top blackjack pros are not all mathematicians, but they all do understand the basic math and logic of the game. If you study the concepts and the charts presented in this book, you will get a very good feel for the profitability of any 6-deck game you find.
Look at the kinds of betting strategies you’ll need to beat the 6-deck games with 50%, 65%, 75% and 85% penetration. If the penetration is poor, look for any possibilities of beating the game by leaving the table at negative advantages. Consider the possibilities of getting a bigger spread by playing two or more hands at favorable counts. If there’s no practical way to get a healthy edge on the house, then keep your money in your pocket.
Frequently Asked Questions
Q: Why do you provide so much information on games and betting strategies that don’t win? Everyone knows that bad penetration and small betting spreads don’t work for card counters.
A: There are many card counters, perhaps even a majority of them, who do not have professional aspirations, but play at moderate to high stakes in order to acquire casino comps. These players are primarily interested in knowing how to reduce the house edge to a break-even point.
Also, there are many books on the market that poorly explain the betting spreads needed to beat various games. Many amateur card counters believe that they can beat most games with a 1-to-4 spread. My goal in providing these Beat the Deck reports to players is to show what works, how well it works, and what doesn’t work, in any game with any level of penetration.
Q: Is there a quick way to use these charts to judge a game’s profit opportunities “at a glance?”
A: I scan the Win Rate % line first, looking for an advantage of about 1% or better. If I find it, I then look at the Hands Bet/100 in that column. The bigger this number, the better. If the Hands Bet/100 is less than 30, you’ll probably spend too much time standing around watching games, waiting to place a bet. The % advantage might be good, but your hourly unit win may be too small to be worth your time.
Q: Do you have any guidelines for judging when the standard deviation is tolerable?
A: The primary guideline is your own personal bankroll. Casino blackjack is a fairly high-risk investment for a card counter. The “long run” often takes a long time coming. A player who is trying to get an edge over the house in the neighborhood of 1% should look for a standard deviation where the expected win after 100 hours (10,000 hands) is at least half of one standard deviation, and after 1000 hours (100,000 hands), the expected win is twice as much as one standard deviation.
In other words, when looking through the charts for a good betting strategy, I look at the unit wins for 100 hours and 1000 hours, compared to their S.D.s. If my unit win expectation is 100 units after 100 hours, then I don’t want the S.D. to be much more than 200 units. If my unit win expectation after 1000 hours is 1000 units, then I don’t want the S.D. to be much more than 500 units. The greater the number of decks, and the worse the penetration, the more difficult it is for a game to meet these criteria.
Q: Why is the Win Rate % higher for one blackjack betting system, when the Units/Hour is higher for another betting system?
A: In a case like this, the system winning more units per hour is betting more units per hour. This may be due to either betting on more hands/100 seen, or increasing to bigger bets at smaller advantages. Whenever you see this, you can find the answer in the chart data by looking at how many hands were bet by each system, and/or at what advantage each system raised its bets. ♠
Did you know the hi-lo index for insurance can be calculated on the back of a (large) envelope? And the right answer for one deck is 17/12, the exact answer for infinite decks is 10/3, and the correct answer for any other number of decks can be found with linear interpolation?
This article attempts to explain and extend the ideas behind Arnold Snyder’s 1980/81/82 pamphlet: Algebraic Approximation of Optimum Blackjack Strategies. The extended system can handle unbalanced true counts, such as TKO or C. Membrino’s true-counted Red 7, as well as balanced counts, such as hi-lo. The formulas developed will be more complicated than Arnold’s, since we have easy access to computers nowadays, freeing us to abandon simplicity in favor of utterly preposterous precision. The system can be used for any strategy decisions where Effects of Removal (EoRs) are available. In addition, a formula will be given that produces insurance indexes in the form of simple fractions.
Even though a lot of number crunching is involved, the ideas behind the system are fairly simple. If you accept the concepts of proportional deflection and linear transformation of a count as espoused by Peter Griffin in Theory of Blackjack (ToB), then you must grant that the results are exact. Unfortunately, this will not stop debate on the subject, since decks of cards aren’t precisely normal in their distribution, making proportional deflection an approximate theory. Therefore, indices generated by simulators may be slightly more accurate. On the other hand, simulators sometimes generate different indexes at different penetrations, and who needs that?
At any rate, from the standpoint of optimizing expectation (as opposed to SCORE), these algebraic indexes are almost certainly “close enough” by any standard. The system cannot provide surrender indexes, because of the way EoR tables have traditionally been laid out.
I’ll try to describe the method in such a way that anyone versed in elementary probability theory and Peter Griffin’s work can understand. Anyone unfamiliar with ToB will assuredly not fully comprehend a word I’ve written. In particular, readers should feel comfortable with the material in Appendix A of Chapter 7 and the reasoning in Appendix C of Chapter 5. You must also know how to use the “Effect of Removal” (EoR) tables in Chapter 6.
To illustrate, I’ll develop a hi-lo insurance index for 1 deck, and a true-counted Red 7 insurance index for 6 decks.
Exact Insurance EoRs
Very precise insurance indexes are possible because we can derive exact insurance EoRs. An EoR is the amount you need to add to the full deck EV to get the EV for the 51 cards after a particular card is removed. I.e. EV(full deck) + EoR = EV(51 cards with specified removal), or rearranging: EoR = EV(51 cards with specified removal) – EV(full deck). [Note: EV stands for “Expected Value,” also known as the expectation, in this case for the insurance bet only.]
Let’s find the insurance EoR for a 5. Assume a 1 unit bet …
There are 16 Ts in a full deck and 36 other cards. If the hole card is a T, you win 2 units, if the hole card is one of the others, you lose 1 unit. EV(full deck) = 16/52 * 2 + 36/52 * (-1) = -1/13
If you remove a 5, there are now 51 cards remaining, still with 16 Ts, but now with 35 others. EV(51 cards with a 5 removed) = 16/51 * 2 + 35/51 * (-1) = -1/17
So EoR(5) = (-1/17) – (-1/13) = 4/221 ~= 1.81% as listed in ToB, Chapter 6. 4/221 is the exact EoR, 1.81% is the decimal approximation.
This same EoR = 4/221 can be used for any number of decks, and any number of cards removed, by using the conversion outlined on the fourth page of Chapter 6 of ToB: multiply by 51/(cards remaining). For example, if you remove four 5s from a 6 deck shoe, the total EoR will be: 4*(4/221) * 51/308 = 12/1001
(We’re already at the point where a CAS — Computer Algebra System — program or calculator would come in handy. Examples of CAS are Mathematica, Maxima, XCas, and calculators like the HP 50g or TI-89. You could also do the fractions by hand or throw caution to the wind and take my numbers on faith.)
Since we know the EV at the top of a 6d shoe is -1/13, we know now that the EV after four 5s are removed is: -1/13 + 12/1001 = -5/77. You should calculate the expectation directly, based on four non-tens removed from a 6d shoe, to convince yourself that -5/77 is the exact answer (ignore the ace upcard, just this once). If you can follow the argument thus far, you have a very good model of how EoRs work for any strategy — not just for insurance. On the other hand, if you can’t calculate the -5/77 EV, you probably should get off now. This article may exceed your level of preparation.
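The fractions above can be checked with exact rational arithmetic; here is a minimal Python sketch using the standard fractions module (the `insurance_ev` helper is my own, not anything from the article):

```python
from fractions import Fraction as F

def insurance_ev(tens, others):
    """Exact insurance EV per unit bet: win 2 on a ten, lose 1 otherwise."""
    total = tens + others
    return F(2 * tens - others, total)

full      = insurance_ev(16, 36)   # full single deck: -1/13
less_five = insurance_ev(16, 35)   # one 5 removed:    -1/17
eor5 = less_five - full
assert eor5 == F(4, 221)

# Four 5s out of a 6-deck shoe via the 51/(cards remaining) conversion:
ev_6d = F(-1, 13) + 4 * eor5 * F(51, 308)
# Direct check: 96 tens and 212 others remain among 308 cards.
assert ev_6d == insurance_ev(96, 212) == F(-5, 77)
```

Both routes, the EoR conversion and the direct count of the depleted shoe, land on the same exact fraction.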
Using reasoning similar to the above, we can show that the insurance EoR for any card other than a T is also 4/221, and that the EoR for a T is -9/221. The EoRs for all 52 cards sum to zero.
You can multiply all the insurance EoRs by 221 (a “linear transformation”) to get the perfect, balanced, insurance count of 4 4 4 4 4 4 4 4 4 -9 (we follow Griffin’s convention of listing tags in numerical order from ace through ten). This is Thorp’s old ten count “parameterized as a point count,” according to ToB Chapter 4. These tags have correlation 1 with the EoRs, so if you know how to calculate a correlation coefficient, you can quickly correlate any set of tags with 4 4 4 4 4 4 4 4 4 -9 to get an Insurance Correlation (IC) slightly more accurate than what you get using the three digit EoRs listed in ToB.
A Word about Linear Transformations of Count Systems
A linear transformation is the act of adding a constant to every tag in a count, and/or multiplying every tag by a constant. A little thought should convince you that linear transformations will not degrade the information accessible to an arithmetically adroit counter. For example, suppose all the hi-lo tags were doubled. You of course will adjust your indices and your betting ramp, but at the end of the day, you’re no better nor worse off than when you started. Similarly, you could subtract 1 from all the tags, and unbalance the system, but at any given penetration, it’s trivial to convert back to hi-lo.
In general a set of tags, tag_i for i = 1, 2 … 52, defines a count system S. If, for constants m and b, tag_i(S) x m + b = tag_i(S’) for each i, then system S’ has linear equivalence with S. The correlation between S and S’ will be 1, so the correlation between either count and any third set of numbers (such as EoRs) will be identical.
For example, if S = hi-lo, m = 2 and b = 0.5, we have the count:

S’ = -1.5 2.5 2.5 2.5 2.5 2.5 0.5 0.5 0.5 -1.5

which is nothing more than hi-lo in disguise. This new count has an insurance correlation of 76%, same as the insurance correlation for hi-lo. Betting and playing correlations are also identical.
Or if you let m = 3/13 and b = 1/13 and apply those to Thorp’s ten count, S = 4 4 4 4 4 4 4 4 4 -9, you generate S’ = 1 1 1 1 1 1 1 1 1 -2 — the well known Noir count. Both counts have an insurance correlation of 100%.
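The correlation claims above are easy to confirm numerically; a sketch in Python (the `pearson` and `deck` helper names are mine, not the article's notation):

```python
from math import sqrt

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / sqrt(vx * vy)

# Per-rank tags, ace through ten:
hilo  = [-1, 1, 1, 1, 1, 1, 0, 0, 0, -1]
thorp = [4, 4, 4, 4, 4, 4, 4, 4, 4, -9]   # perfect insurance tags
noir  = [1, 1, 1, 1, 1, 1, 1, 1, 1, -2]   # Thorp x 3/13 + 1/13

def deck(tags):
    """Expand per-rank tags over one 52-card deck (16 tens, 4 of each other rank)."""
    return [t for t in tags[:9] for _ in range(4)] + [tags[9]] * 16

print(round(pearson(deck(hilo), deck(thorp)), 4))   # hi-lo IC, about 0.7601
print(round(pearson(deck(thorp), deck(noir)), 6))   # linear equivalence: 1.0
```

The 0.7601 figure is the hi-lo 76% insurance correlation quoted earlier, and the linearly equivalent Thorp and Noir counts correlate perfectly.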
Calculating the One Deck Insurance Index for Hi-lo
Now to find the one deck insurance index for hi-lo, we will perform six steps:
1) Remove the upcard (ace for insurance) as well as any player cards from the shoe.
2) Balance the count with a linear transformation.
3) [A common sense check] Find the effect on each individual card when you increase the TC by 1, using the principle of proportional deflection.
4) Write an equation that finds EV (or EV delta, for strategies other than insurance) based on 3), the EORs, and the starting EV delta.
5) Solve for TC when the EV delta = 0.
6) Convert this TC back to the original count to get your index.
As follows:
1) Remove an ace from the deck. Now there are 51 cards in the “shoe” and the count is no longer balanced.
2) To balance the count (making it much simpler to apply Griffin’s formulas) we need to subtract 1/51 from all the tags, which is a linear transformation. So we have three aces that are tagged -52/51, four each of 2s, 3s, 4s, 5s and 6s tagged 50/51, four each of 7s, 8s and 9s tagged -1/51, and 16 Ts tagged -52/51. If you add the tags all up, they come to zero, and this new count is guaranteed to provide the counter with exactly the same information as a hi-lo counter at any given penetration.
We could multiply all the tags by 51, to make them integers, and it might make life a little easier for a counter using the tags in the real world, but our calculations would actually become slightly more complicated — and we have no intention of using these tags in the real world.
A subtle point concerning this new “rebalanced” count is how it reveals that the so-called “neutral” cards are now slightly correlated with the count. Because of the removal of the ace, a high count now portends a slightly higher density of 7s, 8s and 9s, and you can see that clearly now, because 7s, 8s, and 9s have small negative tags.
3) For each of the 51 cards, a given True Count (TC) implies that, on average, it no longer represents exactly one card. The principle of proportional deflection says it’s “deflected” from the value of 1, by an amount proportional to the TC and the individual (balanced) count tag. Under this model, cards can assume fractional values in order to represent the “average” situation. The formula for each card is: 1 – TC*Tag*51/y, where y = the sum of (tag squared) for all 51 cards. TC, here, is expressed on a per card basis, so a TC of +1/52 corresponds to a full deck TC of +1. There will still be 51 cards in the deck after the cards are deflected. In this case:

y = 19*(-52/51)^2 + 20*(50/51)^2 + 12*(-1/51)^2 = 1988/51
If we happen to have a per-card TC of +1/52, the average number of aces would then be:
3*[1 – (1/52)*(-52/51)*51/(1988/51)]
And the average number of Ts would be:
16*[1 – (1/52)*(-52/51)*51/(1988/51)]
And the total number of 2s, 3s, 4s, 5s and 6s, on average:
20*[1 – (1/52)*(50/51)*51/(1988/51)]
And the total number of 7s, 8s and 9s, on average:
12*[1 – (1/52)*(-1/51)*51/(1988/51)]
If you add these all up, you’ll find the total number of cards is still 51. It’s also fairly easy to prove that it adds up to 51 for any TC you choose to plug in. Finally, if you use any number other than 1988/51 for y, the sum of tags for the 51 cards would no longer equal -51/52 as it must, showing our deflection formula is accurate. (Please take your time to verify each of these statements. You don’t have to take my word for anything.)
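Those verification chores are exactly the kind of thing a computer handles well; a brief Python sketch with exact fractions (the `deflect` helper is mine, not the author's notation):

```python
from fractions import Fraction as F

# Rebalanced hi-lo groups after the ace upcard is removed: (cards, tag)
groups = [(3, F(-52, 51)), (20, F(50, 51)), (12, F(-1, 51)), (16, F(-52, 51))]

y = sum(n * t * t for n, t in groups)
assert y == F(1988, 51)

TC = F(1, 52)   # a per-card TC of +1/52, i.e. a full deck TC of +1
deflect = lambda n, t: n * (1 - TC * t * 51 / y)

# The deflected deck still holds exactly 51 cards...
assert sum(deflect(n, t) for n, t in groups) == 51
# ...and its tags sum to -51*TC = -51/52, mirroring the count of the seen cards.
assert sum(t * deflect(n, t) for n, t in groups) == F(-51, 52)
```

Plugging any other value in for y breaks the second assertion, which is the consistency check described above.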
4) Now the nice thing about the 1d insurance case is we don’t have to convert EoRs. If we use our TC formula for the various cards, the number remaining at the end will be 51. So we’ll forego multiplying by 51/51 (but we must remember to convert when we tackle 2 or more decks). The number “removed” for each of the 51 cards, is just the expression after the “1 -“. E.g. for one given ace, remember the formula was:
1 – TC*(-52/51)*51/(1988/51)
So the amount removed for that ace is:
TC*(-52/51)*51/(1988/51)
which will be a negative removal (i.e. an addition) for a positive TC. So combining all this information with the EoRs, and remembering to add the full EoR for the ace removed off the top, we deduce that the total insurance EV, for any given TC is:
EV = -1/13 + 4/221 +
3*TC*(-52/51)*51/(1988/51) *(4/221) +
20*TC*(50/51)*51/(1988/51) *(4/221) +
12*TC*(-1/51)*51/(1988/51) *(4/221) +
16*TC*(-52/51)*51/(1988/51) *(-9/221)
This is similar to the example on the fourth page of Chapter 6, ToB, showing how to use the EoR tables, just a tad more tortuous for us mathe-masochists.
5) To find the per-card index for our “rebalanced” count, we set this last expression equal to zero and solve for TC:

TC = 497/10608

[Note: Setting the expression for the number of Ts from step 3), 16*[1 – TC*(-52/51)*51/(1988/51)], equal to 17 and solving for TC also returns 497/10608. This last expression just says there are twice as many non-tens as tens in the deck. So you can solve for TC without bothering with EoRs, just from the expression for Ts in 3). But I’m using EoRs to develop a more general system that can handle any strategy — not just insurance.]
6) Now to get the hi-lo per-card TC from this “rebalanced” count at any penetration, all you need to do, believe it or not, is subtract 1/51. Then to convert to a full deck TC, you multiply this per-card TC by 52. So our surprisingly simple answer is: index = (497/10608 – 1/51) * 52 = 17/12
There are other ways to look at it and think about it, but the blankety-blank mess above is about as simple as it gets for this problem. Luckily we live in the age of computers, and once you codify the six steps in a program or spreadsheet you can immediately harness the system to kick out an index for any strategy decision.
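As one possible codification (specialized to insurance EoRs for brevity; the `insurance_index` function and its group layout are my own illustration, not the author's program), the six steps look like this in Python with exact rationals:

```python
from fractions import Fraction as F

NT, T = F(4, 221), F(-9, 221)   # exact insurance EoRs: non-ten, ten

def insurance_index(groups, decks):
    """groups: (cards per 52-card deck, tag, EoR) triples, aces listed first.
    Returns the full-deck true-count insurance index (ace upcard removed)."""
    n = [c * decks for c, _, _ in groups]
    n[0] -= 1                                           # step 1: remove the ace
    cards = sum(n)                                      # 52*decks - 1
    bal = F(sum(c * t for c, (_, t, _) in zip(n, groups)), cards)
    tb = [t - bal for _, t, _ in groups]                # step 2: rebalance
    y = sum(c * t * t for c, t in zip(n, tb))           # step 3: sum of squares
    base = F(-1, 13) + NT * F(51, cards)                # EV with the ace gone
    # step 4: proportional deflection makes EV linear in the per-card TC
    slope = 51 * sum(c * t * e for c, t, (_, _, e) in zip(n, tb, groups)) / y
    tc = -base / slope                                  # step 5: EV(tc) = 0
    return (tc - bal) * 52                              # step 6: convert back

hilo = [(4, F(-1), NT), (20, F(1), NT), (12, F(0), NT), (16, F(-1), T)]
red7 = [(4, F(-1), NT), (20, F(1), NT), (2, F(1), NT),   # red sevens count +1
        (10, F(0), NT), (16, F(-1), T)]
print(insurance_index(hilo, 1))   # 17/12
print(insurance_index(red7, 6))   # 2015/1944
```

Note that the red and black sevens must be kept as separate groups; averaging their tags to 1/2 would change the sum of squares and spoil the exact answer.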
Second Illustration — Six Deck Insurance Index for Red 7
We follow the same six steps…
1) Remove an ace from the shoe. Now there are 311 cards in the shoe and the count is (still) unbalanced.
2) To balance the count we subtract 13/311 from all the tags, which is a linear transformation. So we have 23 aces that are tagged -324/311, 24 each of 2s, 3s, 4s, 5s and 6s tagged +298/311, 12 red 7s tagged +298/311 and 12 black 7s tagged -13/311, 24 each of 8s and 9s tagged -13/311, and 96 Ts tagged -324/311. If you add the tags all up, they come to zero, and this new count is guaranteed to provide the counter with exactly the same information as a Red 7 true counter at any given penetration.
3) The sum of squares:

y = 119 * (-324/311)^2 + 132 * (298/311)^2 + 60 * (-13/311)^2 = 77892/311
The formula for each card is:
1 – TC*Tag*311/y
The rest of this step is left as an exercise for the interested reader.
4) The amount removed for one of the 23 remaining aces is:

TC*(-324/311)*311/(77892/311)

which will be a negative removal (i.e. an addition) for a positive TC. Combining this type of information with the EoRs, remembering to adjust EoRs by a factor of 51/311, and remembering to add the full EoR for an ace off the top, the total EV for a given TC is:

EV = -1/13 + (4/221)*(51/311) +
23*TC*(-324/311)*311/(77892/311) * (4/221)*(51/311) +
132*TC*(298/311)*311/(77892/311) * (4/221)*(51/311) +
60*TC*(-13/311)*311/(77892/311) * (4/221)*(51/311) +
96*TC*(-324/311)*311/(77892/311) * (-9/221)*(51/311)

5) Setting this expression equal to zero and solving gives the per-card TC = 149293/2418336.

6) Subtracting 13/311 and multiplying by 52 converts back to a full deck Red 7 true count: index = (149293/2418336 – 13/311) * 52 = 2015/1944 ~= 1.04
[Note: This is the full deck true counted Red 7 insurance index assuming the normal Initial Running Count (IRC) for 6 decks of -12. People using another IRC will need to adjust the index accordingly.]
Comparison with Arnold Snyder’s Algebraic Method
In the case where only the dealer’s upcard is removed from one deck the 6 steps can be collapsed to this formula:
index = 52/51 * m * y / i + 52/51 * t
Where m is precisely as the Bishop defined it, namely:
-(Griffin’s 11th column) – EoR(upcard).
And y is the same as Snyder’s “p” except it is based on sum of squares of the modified tag values, instead of the original tags. And i is the inner product, also based on the modified tags instead of the originals.
Hence, the famous Algebraic Approximation formula:
index = m * p / i + 52/51 * t
is closely related to the procedure above. After calculating a dozen different strategies using Griffin’s most recent EoR tables, I can state that indexes calculated with 1) through 6) above rarely differ from Snyder’s until you reach the third digit, which is insignificant to the player’s hourly win rate.
For example, the one deck insurance index under the ’81 system for hi-lo is 1.4332 while the method above yields 1.4167. It’s impossible to construct a set of cards with 51 cards or less having a TC that falls between those two numbers. Thus, the two indexes are in practice indistinguishable.
The main value of this new procedure is in expanding the algebraic system to unbalanced true count systems. In addition you get to entertain your friends with insurance indexes in the form of exact fractions!
Toward Perfect Insurance — A Challenge
Griffin states that “insurance is linear,” and we know EoRs can provide exact expectations for insurance bets, and that perfect insurance decisions are possible with the Noir count, for example. So it might seem reasonable to expect our procedure to provide exact insurance indexes, or at least the best possible index for every situation. Unfortunately, for most counts this is only possible if you are willing to use different indexes at very deep penetrations.
To see this, imagine one deck dealt down to the last two cards — one of which is the dealer’s hole card. A hi-lo count of zero portends the following probabilities for the two unseen cards: ace-low — 60/446, ten-low — 320/446, mid-mid — 66/446. So an insurance bet of one unit has a positive expectation of +17/223, and a simulator should produce an index of 0 in this extreme situation.
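The pair counting behind that example can be verified with a short Python sketch using exact fractions (all variable names are mine):

```python
from fractions import Fraction as F

# Unseen ranks: 3 aces (the fourth is the upcard), 20 lows, 12 mids, 16 tens.
# Unordered two-card holdouts whose hi-lo tags sum to zero:
ace_low = 3 * 20          # 60
ten_low = 16 * 20         # 320
mid_mid = 12 * 11 // 2    # 66
total = ace_low + ten_low + mid_mid   # 446 zero-count pairs

# The hole card is a ten only in a ten-low pair, and then only half the time:
p_ten = F(ten_low, total) / 2         # 80/223
ev = 2 * p_ten - (1 - p_ten)
assert p_ten == F(80, 223) and ev == F(17, 223)
```

So with a hi-lo count of zero and two cards left, insurance is worth +17/223 of a unit, even though the full-deck index says to decline it.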
However, I challenge anyone to find an insurance index that’s more accurate than the one generated by this algebraic system, for any popular count system throughout a realistic range of penetrations.
Extension to Unbalanced Running Count Systems
Several simulations lead me to this rule of thumb: the best index for a conventional unbalanced count, such as Red 7, is the running count corresponding to the true count index at the most profitable penetration that can actually occur in the game you’re facing — or just under that penetration. The most profitable penetration normally occurs right before the last hand is dealt from the shoe.
For example, for Red 7, since the six deck true counted insurance index is 2015/1944, I would look at the RC where 4.5 decks had been dealt out and calculate: RC index = 1.5 x 2015/1944 ~= 1.55, which dovetails nicely with Snyder’s recommended index of +2 for shoe games. Again, this is the index when you start with an initial running count of -12, which is necessary to make the final running count equal to zero once all the cards in the shoe are counted.
Simulations might reveal some exceptions to my empirical rule of thumb.
General Linear Insurance Index Formula:
Applying our six steps to the general insurance case and simplifying (don’t try this at home) yields:
index =

    52 (d*s^2 - 48*d*s*t - 4*d*y + 4*a^2 + 48*a*t - 2*a*s + y)
    -----------------------------------------------------------
                  48 (52*d*t - d*s + a - t)
where d = number of decks, a = ace tag, t = tens tag, s = sum of all tags over one 52-card deck, and y = sum of squares of tags over one deck.
For example, for Red 7, 6 deck case, d = 6, a = -1, t = -1, s = 2, and y = 42. So index = 2015/1944 ~= 1.04, matching the result of the second illustration.
It is assumed, for unbalanced counts, that an IRC of -s*d is used, so true count division can work normally.
Changing the 52 in the numerator to 26 gives the half deck index; changing it to 13 gives a quarter deck index; and changing it to 1 returns a per-card index.
Some interesting conclusions can be drawn from the General Insurance Index Formula. For example, you can use it to show that when a = t (as in hi-lo or Red 7), interpolation by the reciprocal of the number of decks (1/d) is possible. Interpolation can also work perfectly in some other cases, e.g. for any perfect insurance count, such as 0 0 0 0 0 0 0 0 0 1: a = 0, t = 1, s = 16, y = 16, and index = -52/3, for any d save d = 1/36 (which is unlikely).
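The formula is quick to check numerically; here is a sketch in Python with exact rationals (the function name `glii` is mine) that reproduces the worked values:

```python
from fractions import Fraction as F

def glii(d, a, t, s, y):
    """General Linear Insurance Index, transcribed from the formula above."""
    num = 52 * (d*s*s - 48*d*s*t - 4*d*y + 4*a*a + 48*a*t - 2*a*s + y)
    den = 48 * (52*d*t - d*s + a - t)
    return F(num, den)

print(glii(1, -1, -1, 0, 40))   # 17/12      (hi-lo, one deck)
print(glii(6, -1, -1, 2, 42))   # 2015/1944  (Red 7, six decks)
print(glii(3, 0, 1, 16, 16))    # -52/3      (a perfect insurance count, any d)
```

The third call returns -52/3 for any number of decks you plug in, illustrating the interpolation claim.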
The GLII formula works for insurance, but for other strategies I revert to the six steps above. A formula would get pretty hairy.
Here is an input string for the formula that you can copy and paste for use in some CAS programs:

52*(d*s^2-48*d*s*t-4*d*y+4*a^2+48*a*t-2*a*s+y)/(48*(52*d*t-d*s+a-t))
[Acknowledgements: This paper was originally published in 1980. Subsequent correspondence with a number of blackjack experts–notably Stanford Wong, Ph. D., Peter Griffin, Ph. D., and Bob Fisher–has led me to revise the original formula and some of the original recommendations.]
The currently employed methods of computing playing strategy indices involve high-speed computers and complex programs based on the intricate complexities of probability mathematics. Though it has been almost 20 years since the first such programs were written, there is still disagreement among experts as to the most accurate methods of approximation.
To approximate blackjack strategy tables, I will take an algebraic approach, which is far simpler than computerized methods. It has been shown that playing strategy indices cannot be accurately determined by linear methods, but it is also true that current computerized methods are imprecise approximations. I am not convinced that current computer methods are more accurate than algebraic methods, in conjunction with certain linear assumptions.
My calculations are based on Peter Griffin’s Theory of Blackjack1. Anyone unfamiliar with this work will assuredly not fully comprehend my methods. I will refer often to Griffin’s methods, and will neither redefine nor explain those concepts that Griffin presents so clearly in his book. I do not mean to imply that Griffin in any way suggests in his book that his calculations be used as I will use them. The theories and methods herein are my own.
For those who are unfamiliar with computer methods of obtaining strategy indices, consider the math involved in computing a single hit-stand decision. Assume a single-deck game, Vegas Strip rules, with the player holding a total of 16 versus a dealer upcard of ten (hereinafter, any 10, J, Q, K will be written “X”). The player is using the Dubner (Hi-Lo) count to keep track of the cards, and wishes to know at which “true count” or “count-per-deck” standing becomes the preferred strategy.
In this counting system, 2s, 3s, 4s, 5s, and 6s are assigned a value of +1 as they are removed from the deck. Ten-valued cards and aces are counted as -1. Count-per-deck is defined as the running count divided by the fractional proportion of one deck remaining. The first consideration in solving this problem is to realize that a player total of 16 may be composed of any number of different combinations of cards. There are, in fact, 145 different combinations of cards which would total hard 16 (See Chart #1, Appendix).
Naturally, one would be more likely to be holding a combination of X-6 than a hand of A-A-A-A-2-2-2-2-4. In fact, if dealt in that order, one would simply have split A-A. In another permutation, one would surely have stood on A-2-A-A-A-2-2, a soft 20, and the decision of how to play such an unlikely 9-card total of 16 would not have presented itself. Dealt as in the table, 4-2-2-…-A, one might conceivably face this decision.
It must be determined which of these 145 combinations are relevant to the decision, according to all the rules, procedures and options of the game. For each of these specific hands, one may determine the precise advantage of hitting or standing simply by considering the outcome of every possible series of player and dealer draws (and down-cards). By properly weighting each of these possibilities, according to how probable each hand and series of events is, one would determine basic strategy for the decision of whether to hit or stand on 16 versus X.
The amount of math involved in any single basic strategy decision is vast. That a highly accurate basic strategy was originally, and painstakingly, computed on crude adding machines is phenomenal.2 Naturally, short-cuts were taken in devising this strategy. Computers made possible precise calculations, but even after decades of mathematical research, there is still dispute over basic strategy.
Julian Braun3 says to split 2-2 vs. 3 in the single-deck game. Stanford Wong4 says to hit. Both are highly qualified mathematicians and computer programmers. Peter Griffin has computed the only 100% accurate single-deck basic strategy, but this strategy has not yet been published. Dr. Griffin informs me that on this particular strategy decision, Braun’s recommendation to split 2-2 vs. 3 is the correct play.
We still have not considered the calculations involved in computing the count-per-deck at which a Dubner count system player would stand on 16 vs. X, rather than follow basic strategy. There is a precise mathematical method which will accurately determine this index number. One need only determine all of the possible deck compositions which would indicate each of the various true counts, then compute the expectation from every possible series of draws, down cards, etc., for each relevant combination of cards totaling 16, weighting each outcome to reflect its probability. The enormity of this task prohibits its being carried out, even by computer. The cost of computing such accurate indices far exceeds any card counter’s, system seller’s, or casino’s stake in the game.
Again, mathematicians have resorted to short-cuts. Rather than analyze every possible deck composition that would indicate each specific count-per-deck, the accepted method is to analyze carefully chosen representative deck compositions. The accuracy of indices so determined is dependent on how closely the chosen decks reflect true probability.
There exist now, and have always been, differences of opinion regarding the best method of choosing a count representative deck. Lawrence Revere,5 it has been pointed out by Julian Braun,6 erred in failing to remove neutral (0 value) cards when composing his deck subsets. The computer-derived indices, therefore, were all based on decks with an abnormally high proportion of neutral cards. A similar error had been made by Braun years earlier in composing decks for the Dubner count indices for Thorp’s 1966 Beat the Dealer.7 Braun later corrected this error, and recomputed these indices, the corrected version of which appear in his How to Play Winning Blackjack.
Stanford Wong’s indices for the Dubner count differ from Braun’s. Wong argues that his method of choosing a representative deck will produce more accurate indices.8 Griffin points out the inherent limited accuracy of determining indices by using these artificially composed decks to represent all possible deck subsets. To quote Griffin, “…even the most carefully computerized critical indices have an element of faith in them.” What Baldwin, et al., once did with adding machines to determine basic strategy is now being done with computers to determine playing strategy indices.
A simpler and, I believe, equally accurate approach, would be to precisely compute one set of strategy tables, by which any counting system could be measured, and indices calculated. On pages 74 to 85 of Theory of Blackjack, Peter Griffin provides the precise information necessary to calculate such indices by algebraic methods.
The formula is simple. Divide the favorability of no action (i.e., not hitting, not splitting, etc.) by the total effect of the count-valued cards. To obtain count-per-deck, simply multiply this by the sum of the squares of the points counted per deck. One thus obtains the critical index at which the action pivots from favorable to unfavorable, or vice-versa. (One must also account for the sum of the removed card(s)’ point values, and adjust this count to reflect “count-per-deck”).
The complete formula looks like this:
(mp/i) + t = count-per-deck for altering strategy
m = “mean” or “favorability”, which Griffin presents in the eleventh column of his tables. It is necessary to reverse the sign (+/-) of Griffin’s “mean”, since he is quoting the favorability of making the action (hitting, etc.)
p = sum of the squares of the “points” counted per deck. Example: for the Dubner system, counting +1 for 2, 3, 4, 5, 6; and -1 for X and A; p=40. For Hi-Opt II counting +1 for 2, 3, 6, 7; +2 for 4, 5; and -2 for X; p=112. Simply multiply the sum of the squares of the points of the 13 different cards by 4 (for the four suits).
i = the “inner product” of the count system’s point values and the effects of removal. These effects are listed in Griffin’s tables. He also explains the method of calculating the inner product (p. 44).
t = the sum of the point values of the removed cards, adjusted for “true” count-per-deck.
(This formula is identical to the one in my original paper except that here I recommend p = the sum of the squares of the point values. In the original paper I recommended p = the absolute sum of the point values. For any level one count, such as the Dubner/Hi-Lo count, as will be shown in this paper, either valuation of p will produce identical indices, since 1^2 = 1. The few discrepancies between the charts in this paper and those of the original paper are due to slight computational and typographical errors in the original charts, discovered by Bob Fisher.
I also published a correction sheet for the first paper, which advised multiplying by “a”, where a = the average point value of a counted card. With the new formula, this methodology is not advised. Both the original formula and this revised variation of it will produce identical indices for level one count systems and nearly identical indices for higher-level counts.
The new formula was developed by considering how the formula might best be applied to determining insurance indices, using Griffin’s data on page 71 of Theory of Blackjack. Stanford Wong, who originally questioned the formula’s validity for higher counts, pointed out to me that insurance indices were optimally calculated according to Bayesian principles, multiplying the point values of the various cards by their respective probability of being drawn. This inevitably produces a weighted count in which the ratio of the count values to one another is identical to the ratio of the respective values if all values of the count were simply squared.)
Example: 14 vs. A, single-deck, dealer stands on soft 17, using the Dubner count:
m = 18.85 (from column 11, p. 74, Theory of Blackjack)
  + .44 (effect of removal of dealer's ace, p. 74, Theory of Blackjack)
  = 19.29; reverse sign (+/-): m = -19.29
p = 40 (sum of the squares of the Dubner point values)
i = -57.44 (Using the effects on p. 74, this figure is calculated for the 39 remaining point-valued cards, the dealer’s ace having been removed.)
t = -1 (count-per-deck will be calculated according to a 51-card deck. With only the dealer’s upcard removed, t will simply equal the point value of this card. To obtain a true count-per-52-card-deck, the single-deck index values, as per this paper, should be multiplied by 52/51 to account for the removal of the dealer’s upcard. For the sake of simplicity, I have neglected this step, which is of minor practical significance to the player, whose count-per-deck approximations would be rounded to the nearest whole number anyway.)
Solving the formula:
(mp/i) + t = ((-19.29 x 40)/-57.44) + (-1) = 12.4
Thus, a player using the Dubner count should stand with a total of 14 versus ace at a count-per-deck of +12.4. On page 137 of How to Play Winning Blackjack, Julian Braun gives this index value as +12. On page 169 of Professional Blackjack (1980), Stanford Wong gives this index as +13. To demonstrate the effectiveness of this simple formula, I will produce all 38 hit-stand indices that Braun records on page 137 of his book. Wong’s indices are on page 169 of his 1980 edition. So that my work may be easily checked, I will provide the single-deck values for m and i, with the dealer upcard removed, calculated as previously explained. In all cases, p = 40 and t = the point value of the dealer’s upcard. Dealer stands on soft 17. (See charts #2 and #3, Appendix.)
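For readers who want to reproduce the arithmetic, here is a minimal Python sketch of the (mp/i) + t formula with the values quoted above (variable names are mine):

```python
# 14 vs. ace, single deck, Dubner (Hi-Lo) count; values quoted in the text
# from Theory of Blackjack, p. 74.
m = -(18.85 + 0.44)   # Griffin's mean plus the upcard's effect, sign reversed
p = 40                # sum of squared Hi-Lo point values over one deck
i = -57.44            # inner product over the 39 remaining point-valued cards
t = -1                # point value of the removed upcard (the ace)

index = (m * p / i) + t
print(round(index, 1))   # 12.4
```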
Inserting the corresponding values from Charts #2 and #3 into the formula and solving, the complete single-deck hit-stand strategy table looks like this:
        2      3      4      5      6      7      8      9      X      A
17                                                                  -8.2
16   -9.0  -10.4  -12.2  -13.5  -13.6    9.9    8.6    4.1    0.0    6.9
15   -5.4   -6.8   -8.5   -9.6   -9.8   11.6   10.9    7.2    4.2    8.2
14   -3.1   -4.6   -6.3   -7.5   -7.7                        15.4   12.4
13    0.0   -1.4   -3.2   -4.8   -4.5
12    4.5    2.7    0.6   -1.0   -1.3
Comparing these indices to Braun’s, my table as a whole is quite similar. Only 4 of the 38 algebraic indices differ from Braun’s by more than 1 point, when the algebraic indices are rounded to whole numbers. All of these major differences are between double-digit indices, so are not highly significant from the standpoint of player expectation. Many players do not even memorize double-digit indices. Comparing the algebraic indices to Wong’s, again the table is remarkably similar. If I round all algebraic indices to the nearest whole number, only one index value differs by two points. This index value is for 16 vs. 6, for which Wong gives -12, and which the algebraic formula determines to be -13.6.
I will point out that this formula will produce index values for some decisions for which no index value actually exists (such as, with this count, 14 vs. X). Such index values will for the most part be double-digit indices that would not contribute to any notable loss of profit because of their rare application.
Consider the problem of “rounding” indices to whole numbers. Griffin has noted that this practice may introduce up to a 10% error in playing decisions. Few players can estimate a count-per-deck within fractions of a point, so indices are recorded as whole numbers. If I take the liberty of rounding the algebraic indices to whole numbers in Wong’s “direction,” be it up or down (so that -13.6 may be rounded to -13), only 9 of the 38 algebraic indices differ from Wong’s by one point, while the other 29 are the same. Note that Braun and Wong differ on 16 of these indices, four of them by 2 points.
One of Wong’s points of contention with Braun’s methodology is that Braun used linear methods (interpolation and extrapolation) to determine his four-deck strategy. Ironically, Braun’s and Wong’s four-deck strategies more closely resemble each other than do their single-deck strategies, where Braun’s indices are not linear based. In his newsletter, Wong presents evidence that his methods of choosing his representative deck subsets are more accurate than Braun’s.
Wong’s arguments appear logical, but I have made no thorough comparative examination of their methods. Likewise, I would not recommend a player use algebraic indices instead of the computerized indices of a qualified expert like Julian Braun or Stanford Wong. I make no claim for the “superiority” of the algebraic formula. It would be of practical use to a player who desired to play a count for which reliable strategy tables were not available, or were incomplete, or were available only at a price the player did not wish to pay.
In any case, considering the extreme approximation technique of creating a “most likely” deck with a double-digit true count, I see no mathematical argument that -12, as per Wong, would be more accurate than the algebraically determined -13.6, and this is the most radical difference between any of Wong’s and the algebraic single-deck hit-stand indices. What most surprises me is that a simple algebraic formula would so closely mimic the results of simulation-based data. (Readers familiar with The Blackjack Formula9 will note that I am essentially doing “more of the same” to determine indices as I did to determine profit potential. Having reduced the problem to the fewest number of variables, I make simple algebraic assumptions.)
To demonstrate the uncanny precision of this mimicry, we may use the algebraic formula to determine indices for specific player hands versus dealer upcards. Wong, on page 171 of his book, quotes the “two-card” hit-stand indices for the single-deck game, dealer stands on soft 17. For instance, both Wong’s computer and the algebraic formula determine the critical index for 13 vs. 2 to be 0. However, Wong’s “two-card” table shows that for the player holding X-3 or 9-4, the correct index is +2. With 8-5 the index is -2. With 7-6: -3.
Applying the formula to X-3 vs. 2:

m = -1.28 (full-deck figure, from Griffin, Theory of Blackjack, p. 85)
    -0.22 (effect of removing dealer’s 2, p. 85)
    +2.44 (effect of removing player’s X, p. 85)
    -0.16 (effect of removing player’s 3, p. 85)

As per Griffin (p. 86, Theory of Blackjack):

m = -1.28 + 51/49 (-0.22 + 2.44 - 0.16) = 0.86
Reverse sign (+/-): m = -0.86
i = -55.94 (after removing dealer’s and player’s cards)
t = 51/49 (+1 - 1 + 1) = 51/49

Solving the formula:

(-0.86 x 40 / -55.94) + 51/49 = 1.66
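The arithmetic above can be sketched in a few lines of code. This is an illustration only, with function and variable names of my own choosing; the constants are the Griffin figures quoted above, and the formula is the (m x p / i) + t index calculation used throughout this paper:

```python
# Sketch of the single-deck index calculation for X-3 vs. 2 (Hi-Lo/Dubner
# count, p = 40 count-valued cards). Constants are Griffin's figures
# (Theory of Blackjack, p. 85); names are mine, not Griffin's or Wong's.

M_FULL_DECK = -1.28   # full-deck m figure for 13 vs. 2

# (effect of removal, Hi-Lo point value) for each card removed from the deck
REMOVALS = {
    "dealer's 2": (-0.22, +1),
    "player's X": (+2.44, -1),
    "player's 3": (-0.16, +1),
}

def single_deck_index(m_full, removals, p=40, i=-55.94, cards_left=49):
    """Critical index = (m * p / i) + t, per the algebraic formula."""
    scale = 51 / cards_left                       # 51-card normalization (Griffin, p. 86)
    m = m_full + scale * sum(e for e, _ in removals.values())
    m = -m                                        # reverse sign (+/-)
    t = scale * sum(v for _, v in removals.values())
    return (m * p / i) + t

print(round(single_deck_index(M_FULL_DECK, REMOVALS), 2))   # 1.66
```

The same function reproduces any of the “two-card” indices below, given the appropriate m, i, and removal values from Griffin’s tables.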
Similarly, solving each of the “two-card” player hands, and comparing the results to Wong’s:
Hand           Wong   Formula
Any 13 v. 2      0       0.0
X-3 v. 2        +2      +1.7
9-4 v. 2        +2      +1.7
8-5 v. 2        -2      -2.5
7-6 v. 2        -3      -2.6
When comparing the algebraic results to hundreds of Wong’s player “total” and “two-card” indices, including pair-splitting and double-down decisions, for both Wong’s Hi-Lo and Halves counts (where p = 44, and all values for both i and t were recalculated), and with the dealer both hitting and standing on soft 17, the single-deck algebraic strategy is so similar to Wong’s that it would take a computer simulation of many millions of hands to determine which strategy is actually superior.
The algebraic formula proves even more precise in mimicking computer-produced indices for multi-deck games, simply by weighting the removed cards according to diminishing effect. It is most convenient to simply calculate an infinite-deck strategy, and interpolate indices using the reciprocal of the number of decks (see Griffin, Theory of Blackjack, pp. 115 and 127). There is very little difference between interpolated indices and indices calculated for the specific number of decks by the algebraic formula. Approximation of infinite-deck indices is quite a bit easier than calculating single-deck indices. The formula becomes simply mp/i, since t is irrelevant to an infinite number of decks.
m = Griffin’s 11th column figure (+/-), with no adjustment for upcard removal
p = 40 (Hi-Lo count)
i = inner product of all 40 count-valued cards
We may further simplify by valuing p=10, and calculating i on the basis of each of the 10 different counted cards. These values are in Chart #4, Appendix.
Example: 14 vs. A
m = -18.85 (from Theory of Blackjack, p. 74, +/-)
p = 10
i = -14.47 (from Chart #4)
Solving mp/i = (-18.85 x 10)/-14.47 = 13.0
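The infinite-deck arithmetic is simpler still. A minimal sketch, assuming only the mp/i formula and the figures quoted above (the function name is mine):

```python
# Infinite-deck index: simply m * p / i. The t term drops out because
# removing a card from an infinite deck cannot change its composition.
# Function name is mine; the figures are from the 14 vs. A example.

def infinite_deck_index(m, i, p=10):
    """m: Griffin's figure (sign reversed, +/-); i: inner product from
    Chart #4; p = 10 counted card denominations."""
    return m * p / i

print(round(infinite_deck_index(-18.85, -14.47), 1))   # 13.0
```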
I will point out here that an “infinite deck” is not only an impossibility, but that if one were keeping a running count of cards as they were removed from an infinite number of standard 52-card decks shuffled together, one’s efforts would be pointless since “true” count would inevitably always equal 0. The term “infinite deck” is used simply to mean that the removal of any one card (dealer’s upcard, in this example) will not in itself alter the ratio of the various cards to one another.
An infinite deck with a true count of +13, as per this example, means an artificial deck created by removing “low” cards and adding “high” cards proportionately, such that for every 52 cards in the deck, the sum of the assigned point values would average +13. In such an artificially composed deck, one’s optimum strategy would be to stand on 14 v. A. In multi-deck games, infinite-deck strategies are quite accurate, since the removal of any individual card has far less effect on deck composition than in a single-deck game.
The complete infinite-deck hit-stand table looks like this:
        2      3      4      5      6      7      8      9      X      A
17                                                                  -6.9
16   -9.1  -10.3  -10.8  -12.0  -13.7    7.4    5.9    3.6   -0.6    7.9
15   -5.9   -7.1   -7.8   -8.8   -9.5    9.4    8.6    6.7    3.3    8.9
14   -3.9   -5.2   -5.8   -6.9   -7.5   17.4                        13.0
13   -0.9   -2.2   -3.1   -4.5   -4.6
12    3.6    1.9    0.5   -1.2   -0.6
Comparing these indices with Wong’s 4-deck indices (p. 173, Professional Blackjack, 1980 ed.), rounding to the nearest whole number, no algebraic index differs by more than one point. Note that 20 of Wong’s single-deck indices change when computing for 4 decks: fifteen by 1 point, three by 2 points, and two by 3 points. The algebraic formula follows Wong’s changes with notable precision. When I interpolate a 4-deck strategy chart (or compute a 4-deck chart by properly weighting the effects of the cards; either method produces almost identical results), the 4-deck indices are slightly closer to Wong’s 4-deck indices than are these infinite-deck indices, and no algebraic index differs from either Wong’s or Braun’s 4-deck indices by more than one point.
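The interpolation just described, linear in the reciprocal of the number of decks, can be sketched as follows. The function name is mine, and the sample values are the single-deck (0.0) and infinite-deck (-0.9) formula indices for 13 vs. 2 from the tables in this paper:

```python
# Interpolating an N-deck index between the single-deck and infinite-deck
# values, treating the index as linear in 1/decks: the single-deck value
# sits at 1/1 and the infinite-deck value at 1/infinity = 0.
# Function name is mine; sample values are from this paper's tables.

def interpolated_index(single_deck_idx, infinite_deck_idx, decks):
    w = 1.0 / decks   # weight on the single-deck anchor
    return w * single_deck_idx + (1 - w) * infinite_deck_idx

# 13 vs. 2: a 4-deck index lies a quarter of the way from the
# infinite-deck value back toward the single-deck value.
print(round(interpolated_index(0.0, -0.9, 4), 2))   # -0.68
```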
Griffin’s tables may thus be used to determine a highly accurate strategy for any balanced point-count system, for any hit-stand, hard or soft double, or pair-splitting decision, for any number of decks, for both soft 17 rules. Using the surrender data Griffin provides on pages 121 and 122 (Theory of Blackjack), one may easily calculate the favorability (m) for both early and late surrender; however, Griffin does not provide the effects of removal for either surrender option. It is not correct to use the effects on pages 74-85 to calculate the value of i for surrender decisions. The effect of removing any card on one’s hit-stand or pair-splitting decision, for which Griffin supplies data, will naturally differ from the effect of removing that same card on one’s surrender decision.
One other common rule variation for which data is not supplied in Griffin’s tables is pair splitting when doubling after splitting is allowed. Nor does Griffin provide data on splitting X-X vs. upcards of 4, 5, or 6, occasional plays for single-deck players. Dr. Griffin has informed me that the effects of removal for doubling down on A-9 vs. 4, 5, and 6, respectively, may be used to calculate these indices with notable accuracy. For the most part, Griffin’s “Virtually Complete Strategy Tables” are aptly titled.
Over a weekend, using a programmable calculator, I devised relatively complete 1, 2, and 4-deck strategy tables for Hi-Opt II, Revere’s Point Count, and Uston’s Advanced Count. It was somewhat tedious, but consider the time and money required for computerized methods. I believe these indices to be as accurate as any devised to this point for any point count system.
I will not, by the way, make these strategy tables commercially available. In my opinion, no serious player should be without Griffin’s book, which is all one needs to compute such tables. The calculations I have explained are not difficult. One need not comprehend the more advanced math of Griffin’s appendices to produce strategy tables according to the algebraic method, though again, I will emphasize, one would need Griffin’s book to understand and apply the methods I am proposing.
A few fine points: any time the inner product (i) is negative, as in all hit-stand decisions, the critical index for changing strategy will indicate the point at which the action (i.e., hitting) pivots from favorable to unfavorable. Any time i is a positive number, as in most double-down and pair-splitting decisions, the critical index will indicate the point at which the action pivots from unfavorable to favorable. Chart #4 (Appendix) also gives the values of i for the most common doubling and pair-splitting decisions, Dubner count, p = 10, infinite deck.
Note that of the 37 most common double-down and split decisions, only 8-8 vs. X has a negative value for i, accurately indicating that for this decision, one will be determining the critical index at which the action becomes unfavorable. This infinite-deck index value, by the way, works out to be +5.2, which compares favorably with both Wong’s and Braun’s 4-deck index value of +6.
Of the 37 infinite-deck doubling and splitting indices that may be easily calculated from Chart #4, using Griffin’s tables for m, 26 round off precisely to Wong’s 4-deck decisions. The algebraic method consistently produces index values comparable to computer-derived values for every decision I have tested, and I have tested many examples for every type of decision for which Griffin provides data. The indices that Wong changes when switching from his Hi-Lo to Halves counts likewise change when calculated thus algebraically.
I will note that a major difference between Wong’s and Braun’s indices, relative to the algebraic indices, is that Wong’s methodology produces indices which correspond more precisely to indices produced via algebraic and linear assumptions. An objective examination of both Wong’s and Braun’s methods will, no doubt, be done in time. Should Braun’s methods prove superior, then certain assumptions regarding algebraic error, depending on how Wong erred, may be made. Should Wong’s methods prove to be superior, the theoretical implications are interesting.
In my original version of this paper, I stressed the major value of the algebraic formula would be for players who play in single-deck games, and whose current systems do not provide “two-card” hit-stand decisions. From Griffin’s table of “Average Gains for Varying Basic Strategy” (p. 30, Theory of Blackjack), note that some hit-stand decisions alone are worth more than all pair-splitting decisions combined.
Some of these most valuable hit-stand decisions, such as the 13 vs. 2 previously analyzed, can be most efficiently played in single-deck games by using two-card decisions. It occurred to me that a player might potentially realize more profit in a single-deck game from learning two-card decisions for 16 vs. X, and both 12 and 13 vs. 2 and 3, than by learning all hard and soft doubling and pair-splitting decisions combined.
From the practical point of view, the only pair-splitting indices worth learning at all are splitting X-X vs. 4, 5, and 6. Of the doubling indices, only 10 and 11 vs. X, and 11 vs. ace are worth varying basic strategy for. A sophisticated player would memorize strategy indices according to potential profitability.
Since publication of my original paper, Peter Griffin has pointed out to me that the way to compute the gain from learning two-card indices, as opposed to a single non-composition-dependent index for a decision, is to calculate the correlation of the count system for each two-card hand, obtain a weighted average correlation for the decision, and compare this to the correlation for the non-composition-dependent decision. Learning two-card indices is not, alas, practical, as such strategies will not raise the simple correlation of the count system sufficiently to warrant the increased memory effort.
I will note, however, that the recommendations of most systems developers to learn and utilize strategy tables for pair-splitting, surrender, and most double-down decisions are ill-considered, since the potential gains from such strategies are so negligible that most players should not chance making errors by attempting to employ such indices. The information provided in Theory of Blackjack, in conjunction with the formula presented in this paper, is more than sufficient to develop a count strategy for any balanced count system, as complete as any player could practically apply at the tables.
Until system sellers analyze and incorporate into their systems the wealth of information in Griffin’s Theory of Blackjack, serious players should study this book themselves.
As I pointed out in The Blackjack Formula, the financial opportunities of blackjack, as a “get-rich-quick” racket, are largely imaginary. The very effective casino countermeasures, which Thorp predicted as inevitable in the 1962 edition of Beat the Dealer, have been largely ignored by most system sellers since. Casinos have learned that it is in their interest to keep the blackjack profit myths alive. However, many casinos do not offer games exploitable to any profitable end by card counting. The profit potential of even the best games available in this country can only be realistically taken advantage of by highly knowledgeable players with sizeable bankrolls. A present-day professional card counter must enter the game with the same attitude, preparation, inside information, and ability to withstand fluctuations, as any investor of sizeable amounts of money on Wall Street.
In his bibliography for Theory of Blackjack, Peter Griffin states that if he were to recommend one book, and one book only, on the subject, it would be the 1966 edition of Thorp’s Beat the Dealer. My recommendation for a second book on this subject would be, without question, Griffin’s Theory of Blackjack. For the serious blackjack student, Griffin’s work stands alone as a detailed analysis of the probabilities and possibilities of applied blackjack strategy.
Chart 1, All Combinations of Cards that Total Hard Sixteen
X-6, X-5-A, X-4-2, X-4-A-A, X-3-3, X-3-2-A, X-3-A-A-A, X-2-2-2, X-2-2-A-A, X-2-A-A-A-A
9-7, 9-6-A, 9-5-2, 9-5-A-A, 9-4-3, 9-4-2-A, 9-4-A-A-A, 9-3-3-A, 9-3-2-2, 9-3-2-A-A, 9-3-A-A-A-A, 9-2-2-2-A, 9-2-2-A-A-A
8-8, 8-7-A, 8-6-2, 8-6-A-A, 8-5-3, 8-5-2-A, 8-5-A-A-A, 8-4-4, 8-4-3-A, 8-4-2-2, 8-4-2-A-A, 8-4-A-A-A-A, 8-3-3-2, 8-3-3-A-A, 8-3-2-2-A, 8-3-2-A-A-A, 8-2-2-2-2, 8-2-2-2-A-A, 8-2-2-A-A-A-A
7-7-2, 7-7-A-A, 7-6-3, 7-6-2-A, 7-6-A-A-A, 7-5-4, 7-5-3-A, 7-5-2-2, 7-5-2-A-A, 7-5-A-A-A-A, 7-4-4-A, 7-4-3-2, 7-4-3-A-A, 7-4-2-2-A, 7-4-2-A-A-A, 7-3-3-3, 7-3-3-2-A, 7-3-3-A-A-A, 7-3-2-2-2, 7-3-2-2-A-A, 7-3-2-A-A-A-A, 7-2-2-2-2-A, 7-2-2-2-A-A-A
6-6-4, 6-6-3-A, 6-6-2-2, 6-6-2-A-A, 6-6-A-A-A-A, 6-5-5, 6-5-4-A, 6-5-3-2, 6-5-3-A-A, 6-5-2-2-A, 6-5-2-A-A-A, 6-4-4-2, 6-4-4-A-A, 6-4-3-3, 6-4-3-2-A, 6-4-3-A-A-A, 6-4-2-2-2, 6-4-2-2-A-A, 6-4-2-A-A-A-A, 6-3-3-3-A, 6-3-3-2-2, 6-3-3-2-A-A, 6-3-3-A-A-A-A, 6-3-2-2-2-A, 6-3-2-2-A-A-A, 6-2-2-2-2-A-A, 6-2-2-2-A-A-A-A
5-5-5-A, 5-5-4-2, 5-5-4-A-A, 5-5-3-3, 5-5-3-2-A, 5-5-3-A-A-A, 5-5-2-2-2, 5-5-2-2-A-A, 5-5-2-A-A-A-A, 5-4-4-3, 5-4-4-2-A, 5-4-4-A-A-A, 5-4-3-3-A, 5-4-3-2-2, 5-4-3-2-A-A, 5-4-3-A-A-A-A, 5-4-2-2-2-A, 5-4-2-2-A-A-A, 5-3-3-3-2, 5-3-3-3-A-A, 5-3-3-2-2-A, 5-3-3-2-A-A-A, 5-3-2-2-2-2, 5-3-2-2-2-A-A, 5-3-2-2-A-A-A-A, 5-2-2-2-2-A-A-A
4-4-4-4, 4-4-4-3-A, 4-4-4-2-2, 4-4-4-2-A-A, 4-4-4-A-A-A-A, 4-4-3-3-2, 4-4-3-3-A-A, 4-4-3-2-2-A, 4-4-3-2-A-A-A, 4-4-2-2-2-2, 4-4-2-2-2-A-A, 4-4-2-2-A-A-A-A, 4-3-3-3-3, 4-3-3-3-2-A, 4-3-3-3-A-A-A, 4-3-3-2-2-2, 4-3-3-2-2-A-A, 4-3-3-2-A-A-A-A, 4-3-2-2-2-2-A, 4-3-2-2-2-A-A-A, 4-2-2-2-2-A-A-A-A
3-3-3-3-2-2, 3-3-3-3-2-A-A, 3-3-3-3-A-A-A-A, 3-3-3-2-2-2-A, 3-3-3-2-2-A-A-A, 3-3-2-2-2-2-A-A, 3-3-2-2-2-A-A-A-A
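As a cross-check on Chart 1, a short enumeration can generate every multiset of card values totaling hard sixteen (A counted as 1, X as any ten-value card), with no denomination used more than four times. This sketch is my own, not part of the original analysis:

```python
def hard_sixteens(target=16, max_per_rank=4):
    """Enumerate multisets of card values (A=1, 2-9, X=10) summing to the
    target, using no value more than four times, as in a single deck
    (the cap on ten-value cards never binds for a total of 16)."""
    results = []

    def extend(hand, total, limit):
        if total == target:
            results.append(tuple(hand))
            return
        # add cards in non-increasing order so each multiset appears once
        for v in range(min(limit, target - total), 0, -1):
            if hand.count(v) < max_per_rank:
                hand.append(v)
                extend(hand, total + v, v)
                hand.pop()

    extend([], 0, 10)
    return results

print(len(hard_sixteens()))   # 146
```

Any questionable chart entry can be checked against this list; for instance, (10, 6) is X-6 and (5, 5, 3, 1, 1, 1) is 5-5-3-A-A-A.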
Chart 2, m (Favorability), 51-card Deck, Dealer Upcard Removed (+/-), Dealer Stands on Soft 17

         2      3      4      5      6       7       8      9      X       A
17                                                                       9.34
16   18.94  23.34  28.38  32.81  29.59   -8.15   -7.61  -3.36  -0.68  -13.71
15   12.87  16.90  21.50  25.18  22.53  -12.25  -11.37  -7.03  -4.31  -16.68
14    7.07  10.47  14.28  17.61  15.51  -12.58                        -19.29
13    1.50   3.89   7.13  10.41   8.51
12   -4.43  -2.33   0.55   3.22   1.49
Chart 3, i (Inner Product), Dubner Count, 51-card Deck, Dealer Upcard Removed, Dealer Stands on Soft 17

          2       3       4       5       6       7       8       9       X       A
17                                                                             -51.72
16   -75.91  -81.75  -86.08  -90.59  -81.22  -32.88  -35.44  -32.60  -26.44  -69.76
15   -81.00  -86.54  -90.57  -95.03  -83.08  -42.08  -41.84  -39.08  -33.08  -72.44
14   -68.95  -74.35  -78.46  -82.70  -71.46  -32.72                          -57.44
13   -58.54  -64.14  -68.17  -71.80  -61.42
12   -50.42  -55.63  -59.37  -62.86  -52.95
Chart 4, i (Inner Product), Dubner Count, Infinite Deck (p=10), Dealer Stands on Soft 17

          2       3       4       5       6       7       8       9       X       A
17                                                                             -12.80
16   -19.32  -20.95  -22.51  -23.72  -19.97   -8.22   -8.86   -8.15   -7.73  -17.44
15   -20.43  -21.99  -23.45  -24.65  -21.34  -10.82  -10.46   -9.77   -9.47  -18.21
14   -17.32  -18.78  -20.24  -21.38  -18.31   -8.18                          -14.47
13   -14.69  -16.13  -17.48  -18.46  -15.67
12   -12.63  -13.97  -15.15  -16.03  -13.42

Doubling and pair splitting:
11    16.45   15.04   12.92   10.88   23.78
10    16.07   20.40   18.63   15.63   10.10   26.53
9     15.50   16.61   18.03   19.00   18.08   20.17   17.52
A-A   31.83   30.89   29.58   28.42   40.11
9-9   21.15   22.40   25.15   28.49   24.22    8.96   15.88   11.51    8.87
8-8   -9.42   13.03
6-6   17.50   21.23   18.75
REFERENCES
Griffin, Peter: Theory of Blackjack (Las Vegas: Huntington Press, 1979)
Baldwin, Cantey, Maisel, McDermott: “The Optimum Strategy in Blackjack,” (Journal of the American Statistical Association, Vol. 51, 1956)
Braun, Julian H.: How to Play Winning Blackjack (Chicago: Data House, 1980)
Wong, Stanford: Professional Blackjack (revised) (La Jolla, CA: Pi Yee Press, 1980)
Revere, Lawrence: Playing Blackjack as a Business (Secaucus, NJ: Lyle Stuart, 1971, ’73, ’75, ’77)
Braun, Julian H.: The Development and Analysis of Winning Strategies for the Casino Game of Blackjack (Chicago: Julian Braun, 1974)
Thorp, Edward O.: Beat the Dealer (New York: Random House, 1962, ’66)
Wong, Stanford: Blackjack World (La Jolla, CA: Pi Yee Press, October 1980)
Snyder, Arnold: The Blackjack Formula (Berkeley, CA: R.G. Enterprises, 1980)
[Note: If you are the kind of math geek, like me, who can actually make it to the end of an article like this, you may be interested in The Blackjack Shuffle Tracker’s Cookbook, in which I use algebra to calculate the value of various casino-style shuffles to a shuffle-tracker, and got results that overturned the conventional wisdom of the time. — Arnold Snyder]