"Big Pig" dice game code goes in infinite loop :C - c

Hello, I'm new to C and I'm currently learning with my university curriculum, so I need to abide by these rules: we can't use arrays or global variables.
So I've been trying to make a dice game named "Big Pig". Right now I'm writing the function the computer uses to play the game, called play_computer(). There is also a function called computer_strategy_decider(), which is supposed to pick yes or no; I made it return 1 or 2 at random to do that. play_computer() rolls two dice and calculates the score from them. If you roll exactly one 1, your score doesn't increase and your turn ends. If you roll two 1s, 25 is added. If you roll any other double of some value a, 4*a (that is, (a+a)*2) is added. And lastly, if you roll two different numbers, their sum is added and the computer gets to decide whether it wants to continue; that's where computer_strategy_decider() comes in.
The problem is with the play_computer() function. Everything seems to work when the computer rolls two different values and decides not to continue; it terminates fine. But if it decides to continue, it goes into an infinite loop, and every iteration of the loop shows the same dice values.
The same loop happens when doubles are rolled. Something in my code doesn't loop properly. I don't know whether it has to do with rand() or not; I don't think it's rand(), since I also use rand() in computer_strategy_decider(). My theory is that it's hopefully something small I missed.
My code was working an hour ago, before I added some changes, which is why I'm frustrated, haha.
#include <stdio.h>
#include <math.h>
#include <stdlib.h>
#include <time.h>
int computer_strategy_decider(){
    int deci;
    srand(time(NULL));
    deci = 1 + (int) rand() % 2;
    return deci;
}
int play_computer(int round_number, int strategy){
    int roll_1, roll_2, cntrl_var = 0, score_comp = 0;
    char answ;
    printf("\nRound %d-- My Turn:", round_number);
    printf("\n---------------------------------");
    while(cntrl_var == 0){
        srand(time(NULL));
        roll_1 = 1 + (int) rand() % 6;
        roll_2 = 1 + (int) rand() % 6;
        printf("\nI got --> [Dice 1]: %d [Dice 2]: %d", roll_1, roll_2);
        if(roll_1 == roll_2 && roll_1 != 1){
            score_comp = score_comp + roll_1*4;
            printf("\nScore: %d", score_comp);
            printf("\nDoubles! Roll again!");
        }
        else if(roll_1 == 1 && roll_2 == 1){
            score_comp = score_comp + 25;
            printf("\nScore: %d", score_comp);
            printf("\nDoubles! Roll again!");
        }
        else if(roll_1 == 1 || roll_2 == 1){
            cntrl_var = 1;
            printf("\nYou got a single one! End of your turn!");
        }
        else{
            score_comp = score_comp + roll_1 + roll_2;
            printf("\nScore: %d", score_comp);
            while(cntrl_var == 0){
                printf("\nDo you want to continue (Y/N)?");
                if (strategy == 1){
                    printf("Y");
                    break;
                }
                else if (strategy == 2){
                    printf("N");
                    break;
                }
            }
            if (strategy == 1)
                cntrl_var = 0;
            else if (strategy == 2)
                cntrl_var = 1;
            else
                cntrl_var = 0;
        }
    }
    printf("\nMy Score: %d", score_comp);
    return score_comp;
}

int main(){
    int round_no = 1, deci;
    deci = computer_strategy_decider();
    play_computer(round_no, deci);
}

I had put srand() inside the while loop, which caused it to be called on every iteration and re-seed the generator with the same time(NULL) value, so rand() kept producing the same rolls. Moving srand() above the loop fixed it!
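For reference, here is a minimal sketch of the usual pattern: seed once in main() and never inside a loop. The roll_die() helper is just an illustrative name, not from the original program.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* illustrative helper: relies on the generator having been seeded once already */
int roll_die(void){
    return 1 + rand() % 6;
}

int main(void){
    srand(time(NULL));   /* seed exactly once, at program start */
    for (int i = 0; i < 5; i++)
        printf("%d %d\n", roll_die(), roll_die());
    return 0;
}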

Related

Monty Hall Problem C Monte Carlo Simulation

So, after seeing another video about the Monty Hall problem, and since I've learned about Monte Carlo simulation methods, I thought I would try to reproduce the 66.66% probability of winning the game if you switch doors. The problem is that I get 50%, and one thing that worried me when thinking up the algorithm was whether my model was correct. I implemented 2 random guesses: one for choosing door 1 to 3 with a 1 in 3 chance, and one for choosing whether to switch doors with a 1 in 2 chance. The if statements assign the prizes to the doors and handle the different possibilities for each of those guesses. I don't know if I can reduce that part, but it works for now (I think). Where was my thinking incorrect? And can you suggest a fix to my algorithm? Thank you very much!
#include <stdio.h>
#include <math.h>
#include <stdlib.h>
#include <time.h>

int main()
{
    int seed=time(NULL);
    srand(seed);
    double u, random[2], suitch=0.0, total=0.0;
    int nall=10000000, nright=0, i, door[3], k, j;
    for(j=0; j<nall; j++)
    {
        for(i=0; i<3; i++)
            random[i]=0.0, door[i]=0;
        for(i=0; i<2; i++)
        {
            u=(1.*rand())/RAND_MAX;
            random[i]=3.*u;
            //printf("%lf\t%lf\n",u,random[i]);
        }
        suitch=2.*u;
        //printf("%lf\n",suitch);
        if(floor(random[0])==0)
            door[0]=1, door[1]=0, door[2]=0;
        else if(floor(random[0])==1)
            door[0]=0, door[1]=1, door[2]=0;
        else if(floor(random[0])==2)
            door[0]=0, door[1]=0, door[2]=1;
        for(i=0; i<3; i++)
            //printf("%d\t",door[i]);
        if((floor(random[1])==0)&&(floor(suitch)==0))
            k=door[0];
        else if((floor(random[1])==1)&&(floor(suitch)==0))
            k=door[1];
        else if((floor(random[1])==2)&&(floor(suitch)==0))
            k=door[2];
        else if((floor(random[1])==0)&&(floor(suitch)==1))
        {
            if(door[1]==1)
                k=door[1];
            else if(door[1]==0)
                k=door[2];
        }
        else if((floor(random[1])==1)&&(floor(suitch)==1))
        {
            if(door[0]==1)
                k=door[0];
            else if(door[0]==0)
                k=door[2];
        }
        else if((floor(random[1])==2)&&(floor(suitch)==1))
        {
            if(door[0]==1)
                k=door[0];
            else if(door[0]==0)
                k=door[1];
        }
        if(k==1)
            nright++;
    }
    total=1.*nright/nall;
    printf("%d\t%d\t%lf\t", k, nright, total);
    return 0;
}
I've been looking at your code way too long, unable to see the problem. What an idiot I've been, lol. There is no problem with the code (except for an accidental, and luckily benign, memory overrun). The problem is what you are trying to simulate.
You simulate 10,000,000 games where, in half of the cases, the player decides to keep his door (his chance is 33.3% in this case) and, in the other half, the player decides to switch (his chance is 66.7% in this case). Of course you are getting 50%, because that's what you are simulating.
Set suitch = 1, permanently, and you'll... get 66.7%.
And yeah... please make random 3 elements long, or stop initializing it past its end, to fix the memory overrun, and comment out the leftover debug for(i=0; i<3; i++), because you are running the simulation's if chain 3 times each iteration for no good reason. But that's unrelated :-)
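To illustrate the point, here is a minimal sketch of a simulation where the player always switches; the variable names prize and pick are mine, not from the original code:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    const int nall = 10000000;
    int nright = 0;
    srand(time(NULL));
    for (int j = 0; j < nall; j++) {
        int prize = rand() % 3;   /* door hiding the car */
        int pick  = rand() % 3;   /* player's first pick */
        /* The host opens a losing door; a switcher wins exactly when
           the first pick was wrong, so the host needs no explicit model. */
        if (pick != prize)
            nright++;
    }
    printf("always switch: %f\n", (double)nright / nall);
    return 0;
}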

Galton Box/Bean Machine-C

I want to code a simple bean machine program. The program will accept user input for the number of balls and the number of slots, and will calculate the path of each ball. The number of balls in each slot will be printed as a histogram as well.
I tried my best to keep the code short and sweet, yet the best I have managed is 112 lines. When I ran my code, I received no errors. However, the output seems to run into some sort of infinite loop (the '#' symbol, which is used to represent counts in the histogram, keeps printing forever for some reason unknown to me).
Apparently there is something wrong with my logic somewhere... or a silly little mistake in syntax (but that would have shown up as an error, wouldn't it?). In a nutshell, I cannot figure out exactly what the problem is. (I attempted to walk through the whole code from start to finish, but my mind kept getting tangled up somewhere in the middle, nowhere near the end.)
Where exactly does my logic go wrong? (Or have I taken the wrong approach to the whole problem?) I do not wish to be given the correct code, so that I can learn by re-editing my code myself.
Any help (hopefully no model-code answers though), even as a single comment, is tremendously appreciated! :)
This is my code:
#include <stdio.h>
#include <stdlib.h>
#include <math.h>
#include <conio.h>
#include <stdbool.h>
#include <time.h>
//Pls excuse my extensive use of libraries even though I don't really use them

int intchecker(float x)
{
    if (floor(x)==x && ceilf(x)==x)
    {
        return 0;
    }
    else {
        return 1;
    }
}

int main(){
    char line[] = " +----+----+----+----+----+----+----+----+----+----+---+";
    char numbers[] = " 0 5 10 15 20 25 30 35 40 45 50";
    float balls,slots;
    int slotarry[9],tlevel,ballnum,column,lcounter=0,slotsduplicate=1,y;//tlevel-number of levels in the triangle
    srand(time(NULL));
    int r;
    printf("==========================================================\nGalton Box Simulation Machine\n==========================================================\n");
    printf("Enter the number of balls [5-100]: ");
    scanf("%f",&balls);
    while (balls>100 || balls<5) {
        printf("\nInput is not within the range. Please try again.");
        printf("\nEnter the number of balls [5-100]: ");
        scanf("%f",&balls);
    }
    while (intchecker(balls)==1) {
        printf("\nInput is not an integer. Please try again.");
        printf("\nEnter the number of balls [5-100]: ");
        scanf("%f",&balls);
    }
    printf("Enter the number of slots [2-10] : ");
    scanf("%f",&slots);
    while (slots>10 || slots<2) {
        printf("\nInput is not within the range. Please try again.");
        printf("\nEnter the number of slots [2-10] : ");
        scanf("%f",&slots);
    }
    while (intchecker(slots)==1) {
        printf("\nHow can there be a fraction of a slot? Please re-enter slot number.");
        printf("\nEnter the number of slots [2-10] : ");
        scanf("%f",&slots);
    }
    tlevel=slots-1;
    for(ballnum=1,column=0;balls>0;balls--,ballnum++,column++){
        if (column%5==0){
            printf("\n");
        }
        if (ballnum<10){
            printf("[0%d]",ballnum);
        }
        else{
            printf("[%d]",ballnum);
        }
        for(;tlevel>0;tlevel--){
            r = rand() % 2;
            if (r==0){
                printf("R");
            }
            else {
                printf("L");
                lcounter++;
            }
        }
        slotarry[lcounter]++;
        tlevel=slots-1;
        lcounter=0;
        printf(" ");
    }
    printf("\n\n%s",numbers);
    printf("%s",line);
    char line2[] = "\n +----+----+----+----+----+----+----+----+----+----+---+";
    for(;slotsduplicate<=slots;slotsduplicate++){
        if (slotsduplicate<10){
            printf("0%d|",slotsduplicate);
        }
        else{
            printf("%d|",slotsduplicate);
        }
        y=slotarry[slotsduplicate];
        if (y==0){
            printf(" 0");
        }
        else{
            for (;y>0;y--){
                printf("#");
            }
            printf(" %d",slotarry[slotsduplicate]);
        }
        printf("%s",line2);
    }
    return 0;
}
Note: This is not completely error-free. This is just my first draft. I just wish to find out why there is an infinite loop.
Here's how I found the problem. First of all, I think it is a bit of a code smell to have a for loop without anything in the initial assignment section. Couple that with the fact that it seems to print # forever, and it looks like y has a garbage value at the beginning of the loop to print the #s.
So I ran your code in the debugger and paused it when it started printing loads of hashes. I checked the value of y and sure enough it was some unfeasibly high number.
Then I checked where y comes from and found you get it from slotarray. I printed it in the debugger and found that all the values in it were unfeasibly high or massively negative numbers. Obviously, slotarray wasn't being initialised correctly, so I looked for where it was initialised and bingo!
Stack variables (of which slotarray is one) must be explicitly initialised in C. I fixed your code with a call to memset.
The whole debugging process I have just outlined took something less than a minute.
ETA: As @EOF points out, there is another bug in that slotarray is defined to contain nine slots (indexed 0 - 8) but you allow people to enter 10 slots. This is a buffer overflow bug.
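For illustration, a minimal, standalone sketch of those two fixes applied to the slotarry declaration; per the asker's request, it is not woven back into the full program:

#include <stdio.h>
#include <string.h>

int main(void)
{
    /* Size 10 so counts for up to 10 slots all have a valid index. */
    int slotarry[10];

    /* Stack arrays start with garbage; zero them before counting into them. */
    memset(slotarry, 0, sizeof slotarry);
    /* (Equivalently: int slotarry[10] = {0}; at the point of definition.) */

    slotarry[3]++;   /* count a ball landing in slot 3 */
    for (int i = 0; i < 10; i++)
        printf("slot %d: %d\n", i, slotarry[i]);
    return 0;
}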

How do I create a "twirly" in a C program task?

Hey guys, I have created a program in C that tests all numbers between 1 and 10000 to check whether they are perfect, using a function that determines whether a number is perfect. Once it finds them it prints them to the user; they are 6, 28, 496 and 8128. After this the program prints out all the factors of each perfect number. This all works fine. Here is my problem.
The final part of my task asks me to:
"Use a "twirly" to indicate that your program is happily working away. A "twirly" is the following characters printed over the top of each other in the following order: '|' '/' '-' '\'. This has the effect of producing a spinning wheel - ie a "twirly". Hint: to do this you can use \r (instead of \n) in printf to give a carriage return only (instead of a carriage return linefeed). (Note: this may not work on some systems - you do not have to do it this way.)"
I have no idea what a twirly is or how to implement one. My tutor said it has something to do with the sleep and delay functions, which I also don't know how to use. Can anyone help me with this last stage? It sucks that all my coding is complete but I can't get this "twirly" thing to work.
If you want to simultaneously perform the tasks of
1. testing the numbers, and
2. displaying the twirly on screen
while the process goes on, then you should look into using threads. Using POSIX threads, you can run the search on one thread while the other thread displays the twirly to the user in the terminal.
#include <stdlib.h>
#include <pthread.h>

int Test();
void Display();

int main(){
    // create a thread each for the two tasks, Test and Display
    // start the threads
    // wait for the Test thread to finish
    // terminate the Display thread after the Test thread completes
    // exit code
}
Refer to chapter 12 (threads) of the Beginning Linux Programming ebook.
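If you do go the threads route, here is a minimal sketch of the idea. The spinner()/worker() names, the done flag, and the 100 ms frame delay are my own illustrative choices, not part of the original task; compile with -pthread on Linux.

#include <stdio.h>
#include <unistd.h>
#include <pthread.h>

static volatile int done = 0;   /* set by the worker when it finishes */

static void *spinner(void *arg)
{
    const char *frames = "|/-\\";
    int i = 0;
    (void)arg;
    while (!done) {
        printf("%c\r", frames[i++ % 4]);
        fflush(stdout);          /* \r alone does not flush the output */
        usleep(100000);          /* 100 ms per frame */
    }
    return NULL;
}

static void worker(void)
{
    /* stand-in for the perfect-number search */
    for (unsigned x = 1; x <= 10000; x++) {
        unsigned sum = 0;
        for (unsigned i = 1; i <= x / 2; i++)
            if (x % i == 0)
                sum += i;
        if (sum == x)
            printf("%u\n", x);
    }
}

int main(void)
{
    pthread_t tid;
    pthread_create(&tid, NULL, spinner, NULL);  /* twirly runs alongside the search */
    worker();                                   /* do the real work in the main thread */
    done = 1;                                   /* tell the spinner to stop */
    pthread_join(tid, NULL);
    return 0;
}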
Given the program upon which the user is "waiting", I believe the problem as stated and the solutions using sleep() or threads are misguided.
To produce all the perfect numbers below 10,000 using C on a modern personal computer takes about 1/10 of a second. So any device to show the computer is "happily working away" would either never be seen or would significantly interfere with the time it takes to get the job done.
But let's make a working twirly for perfect number search anyway. I've left off printing the factors to keep this simple. Since 10,000 is too low to see the twirly in action, I've upped the limit to 100,000:
#include <stdio.h>
#include <string.h>

int main()
{
    const char *twirly = "|/-\\";
    for (unsigned x = 1; x <= 100000; x++)
    {
        unsigned sum = 0;
        for (unsigned i = 1; i <= x / 2; i++)
        {
            if (x % i == 0)
            {
                sum += i;
            }
        }
        if (sum == x)
        {
            printf("%u\n", x);
        }
        printf("%c\r", twirly[x / 2500 % strlen(twirly)]);
        fflush(stdout);   /* stdout is line buffered; flush so the twirly actually appears */
    }
    return 0;
}
No need for sleep() or threads; just key the twirly to the progress of the problem itself and have it update at reasonable intervals.
Now here's the catch: although the above works, the user will never see a fifth perfect number pop out with a 100,000 limit, and even with a 100,000,000 limit, which should produce one more (33,550,336), they'll likely give up, as this is a bad (slow) algorithm for finding perfect numbers. But they'll have a twirly to watch.
i as integer
loop i: 1 to 10000
    sum as integer
    set sum = 0
    loop j: 1 to i/2
        if i % j == 0
            sum += j
    if sum == i
        print i                 // i is perfect
    if i % 100 == 0
        str as character pointer
        set str = "|/-\\"
        set length = 4
        print str[p] using "%c\r" as format specifier
        increment p and assign its modulo by length to p

Really weird: debugging the program works OK, but when I run it I get a weird result

I'm trying to create a 'game' in C which throws 2 dice for the user and 2 dice for the PC; whoever gets the bigger sum wins.
When I debug it in Visual Studio I see good results, both in the variable values and in the console window, but when I run it without debugging the user and the PC always get the same values for their dice (the user gets 2 and 2, and the PC gets 2 and 2, for example).
Can anyone solve it? I've looked at it for the last 3 hours and I just can't find the problem.
#define _CRT_SECURE_NO_WARNINGS
#include <stdlib.h>
#include <stdio.h>
#include <time.h>

int throwDice();
int diceSum();

int main()
{
    int res1;
    res1 = diceSum();
    if(res1==0)
        printf("It is a tie!\n");
    if(res1==1)
        printf("You Won!\n");
    if(res1==-1)
        printf("You Lost\n");
}

int throwDice()
{
    int i;
    srand((unsigned)time(NULL));
    i = (rand()%(6-1)) + 1;
    return i;
}

int diceSum()
{
    int j,a=0,b,c=0,d=0;
    int array[4];
    for(j=1;j<=2;j++)
    {
        array[c]=throwDice();
        a=a+array[c];
        c++;
    }
    for(b=1;b<=2;b++)
    {
        array[c]=throwDice();
        d=d+array[c];
        c++;
    }
    printf("You got %d and %d.\nYour opponent got %d and %d.\n",array[0],array[1],array[2],array[3]);
    if(a==d)
        return 0;
    if(a>d)
        return 1;
    else
        return -1;
}
Every time you call throwDice, you re-initialize your random number generator with the current time.
The resolution of time() is only one second, so within a single run of this program the time doesn't change, and you get the same results.
You are supposed to call srand ONLY ONCE, near the start of your program.
From the documentation
"Two different initializations with the same seed will generate the same succession of results in subsequent calls to rand."
The random number generator doesn't just pull random numbers out of nowhere. When you seed it you're giving it a good number to start with, and from there it can generate a bunch of random numbers.
The problem is, it will give you a sequence of random numbers, but that sequence will be the same if you give it the same seed twice in a row. That's why it's important that the value you seed it with is relatively random. And hey, the time is pretty random; what are the chances two people will end up running the program at exactly the same time?
What you're doing is seeding the generator before every call to rand, instead of seeding it once at the beginning and letting the random number generator do its job. Since time only returns the time in seconds (see the man page), you'll get the same numbers every time the loop runs until the start of the next second.
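A minimal sketch of the corrected structure: srand() moves to the top of main() and disappears from throwDice(). (I've also used the full 1-6 range here, which differs from the question's rand()%(6-1)+1.)

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* no srand() in here -- the generator is already seeded */
int throwDice(void)
{
    return rand() % 6 + 1;   /* 1..6 */
}

int main(void)
{
    srand((unsigned)time(NULL));   /* seed once, at program start */
    int you = throwDice() + throwDice();
    int pc  = throwDice() + throwDice();
    printf("You: %d  PC: %d\n", you, pc);
    if (you == pc)
        printf("It is a tie!\n");
    else if (you > pc)
        printf("You Won!\n");
    else
        printf("You Lost\n");
    return 0;
}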

Generate a random number between two limits

Is there a way to generate a random number between two limits without using the srand and rand functions?
What I have is a while(1) (super loop) which calls a function every 10 ms. In the function, a for loop is used to create a delay, but every time the function is entered the delay should be different, between two limits of 2 and 8 ms: for(x=0;x<random_number;x++)
Try the code below; hope this will answer your question.
#include <stdio.h>
#include <stdlib.h>

int main()
{
    int iMaxRand = 100, ii = 0;
    while( ii < 100 )
    {
        printf("Random Number is:%d\n", rand() % iMaxRand);
        ii++;
    }
    printf("\n");
}
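The snippet above only shows rand() modulo an upper bound. To land in a closed range such as the question's 2 to 8, the usual pattern is lo + rand() % (hi - lo + 1); and if rand()/srand() really can't be used (for example on a bare-metal target), a tiny linear congruential generator is one common substitute. A minimal sketch of both; the constants and names here are my own, not from the question:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* minimal linear congruential generator -- one common stand-in when
   rand()/srand() are unavailable; multiplier/increment from Numerical Recipes */
static unsigned lcg_state = 12345u;          /* any nonzero start value */
static unsigned lcg_next(void)
{
    lcg_state = lcg_state * 1664525u + 1013904223u;
    return lcg_state;
}

int main(void)
{
    srand((unsigned)time(NULL));
    for (int i = 0; i < 5; i++) {
        int a = 2 + rand() % 7;              /* 2..8 using rand() */
        int b = 2 + (int)(lcg_next() % 7u);  /* 2..8 without rand() */
        printf("%d %d\n", a, b);
    }
    return 0;
}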
