I'm new to programming, and new on this site too, so hello...
I'm attempting to keep a running total of the squares of the integers 1 through 10, but I'm getting gibberish answers and I just can't understand why.
To try to figure out what was going wrong, I added the
printf(" running total is %d\n", sum);
line to the while loop, but just got more of the same nonsense...
please see http://codepad.org/UxEw6pFU for the results....
I'm sure this has a blindingly obvious solution...I'm just too dumb to see it though!
anyone know what I'm doing wrong?
#include <stdio.h>

int main(void) {
    int count, sum, square;
    int upto = 10;

    count = 0;
    square = 0;

    while (++count < upto) {
        square = count * count;
        printf("square of %d is %d", count, square);
        sum = square + sum;
        printf(" running total is %d\n", sum);
    }

    printf("overall total of squares of integers 1 thru 10 is %d\n", sum);
    return 0;
}
You need to initialize sum to 0.
EDIT: As others have stated, the reason you're seeing garbage is that sum isn't initialized and contains whatever happens to be in memory. It can be anything, and sum = square + sum just adds square to that uninitialized value.
You are never initializing the value of sum.
The first time your code runs
sum = square + sum;
The value of sum (on the right side) is an arbitrary number because it has not been initialized. Therefore, the resulting value of sum (on the left side) is that arbitrary number plus square.
Simply add a sum = 0 statement like you have for count and square already.
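For example, mirroring the existing assignments (a minimal sketch of the fix):

count = 0;
square = 0;
sum = 0;      /* the missing initialization */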
Right off the bat, you do not initialize 'sum' to anything.
Edit: Here's a cleaned-up version. Depending on your compiler, you might need to enable C99 mode; older compilers don't support declaring the loop variable inside the for statement.
#include <stdio.h>

int main()
{
    const int COUNT_MAX = 10;
    int sum = 0;

    for ( int i = 1; i <= COUNT_MAX; ++i )
    {
        sum += i*i;
    }

    printf("The sum of squares from 1 to 10 is: %d\n", sum);
    return 0;
}
Initialize sum with 0, otherwise it contains arbitrary data:
sum = 0;
See: http://codepad.org/e8pziVHm
sum is not initialized
You should do :
sum=0;
and remove
square=0;
I made a program and it didn't work. In another post someone advised me to debug my programs, so I learned how and debugged this one. It probably has some basic writing errors, but that's because I've changed a lot of things recently while trying to understand what's happening. The third time I input a value in that loop, it changes my variable "i" to that value instead of keeping the number in my array "grade".
First I tried to do it all in the first loop, but as always that didn't help much, so I wrote the code the way you'll see below.
#include <stdio.h>
#include <stdlib.h>

int main()
{
    int j = 0, sum = 0, i = 0;
    int grade[] = {0};

    for (; j < 100; j++) {
        printf("Type a grade:\t");
        scanf("%d", &grade[j]);
        if (grade[j] < 10 || grade[j] > 20) {
            break;
        }
    }

    for (; i < j; i++) {
        sum = sum + grade[i];
    }

    float average = sum / j;
    printf("The average is: %.2f\n", average);
    system("pause");
    return 0;
}
The exercise says to read "x" grades from a student; each grade needs to be between 10 and 20, and if a number is out of that range the loop stops. After that I just need to calculate the average of these grades. I don't really know if my variable average is being calculated correctly, because I couldn't get that far due to my problem. If you input 11, 12 and 13 it should give a sum of 36, but it gives me 26, and I don't know how.
Erik, you should define your array in a coherent way. To allow for the necessary number of elements, try defining a numeric constant; you can use it both for the number of loop iterations and for the size of your grade array. You can also avoid a second loop to calculate the sum of the array by doing that while reading the grades, using only one for loop. Try it this way:
#include <stdio.h>
#include <stdlib.h>

#define MAX_GRADES 100

int main()
{
    int j, sum = 0, i;
    float average;
    int grade[MAX_GRADES];

    for (j = 0; j < MAX_GRADES; j++)
    {
        printf("Type a grade:\t");
        scanf("%d", &i);
        if ( (i < 10) || (i > 20) )
            break;
        grade[j] = i;
        sum += i;
    }

    if (j > 0)
        average = (float)sum / j;
    else
        average = 0;

    printf("The average is: %d, %d, %.2f\n", sum, j, average);
    system("pause");
    return 0;
}
I have a problem with the code below. I am trying to learn C programming. Please help me.
#include <stdio.h>

int main()
{
    int a, factorial;

    printf("Please enter a value :");
    scanf("%d", &a);

    for (int i = 1; i <= a; i++)
    {
        a = (a - 1)*a;
    }

    printf("%d", factorial);
    return 0;
}
Well, in the line a = (a - 1)*a; you actually change the very input you wanted the factorial of. It will also blow up your loop: the for loop continues as long as i is less than or equal to a, and if you choose a = 3, after the first iteration a itself becomes 6, so the loop keeps running until it reaches the integer limit and you get an overflow.
What should you do?
First of all, use a second variable to store the factorial result; you already declared it as factorial. The approach #danielku97 showed is a good way to write a factorial, since an input of 0 also gives the correct result of 1. So good code is:
factorial = 1;
for (int i = 1; i <= a; i++)
{
    factorial *= i;
}
But let's say you insist on using subtraction, the way you just tried; then you need to change the code like this:
scanf("%d", &a);
if (a==1 || a==0){
printf("1");
return 0;
}
factorial = a;
for (int i = 1; i<a; i++)
{
factorial *= (a - i)*factorial;
}
You can see that the code just got unnecessarily longer: an if is included to correct the results for 1 and 0, and you also need to make sure that i never reaches a, since in that case a - i would be zero and would make the factorial result zero.
I hope the explanations help you learn C and algorithms faster.
Your for loop is using your variable 'a' instead of the factorial variable and i. Try something like this:
factorial = 1;
for (int i = 1; i <= a; i++)
{
    factorial *= i;
}
You must initialize your factorial to 1, and then the for loop will keep multiplying it by 'i' until 'i' is greater than 'a'.
You are modifying the input a rather than factorial, and the code also has undefined behaviour because you use factorial uninitialized. You simply need to use the factorial variable you declared:
int factorial = 1;
...
for (int i = 1; i<=a; i++) {
factorial = i*factorial;
}
EDIT:
Also, be aware that C's int can only hold limited values. So beyond a certain point (13! already overflows when sizeof(int) is 4 bytes), you'll cause integer overflow.
You may want to look at the GNU bignum library (GMP) for handling large factorial values.
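If results up to 20! are enough, a wider unsigned type already helps. Here's a minimal sketch assuming a 64-bit unsigned long long; the input limit of 20 and the variable names are just for illustration:

#include <stdio.h>

int main(void)
{
    int a;
    unsigned long long factorial = 1ULL;   /* 64 bits hold factorials up to 20! */

    printf("Please enter a value: ");
    if (scanf("%d", &a) != 1 || a < 0 || a > 20)
        return 1;                          /* reject input that would overflow */

    for (int i = 1; i <= a; i++)
        factorial *= (unsigned long long)i;

    printf("%llu\n", factorial);
    return 0;
}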
My program asks the user to input an integer and adds the sum of the digits.
#include <stdio.h>

int main (void)
{
    int inputNum, sum;

    // Gets the inputNum
    printf("Type in a positive integer, and this program\nwill calculate the sums of the digits: ");
    scanf("%i", &inputNum);

    do
    {
        sum += (inputNum % 10);
        inputNum /= 10;
    } while (inputNum != 0);

    printf("Your sum is: %i\n", sum);
}
But every time I put an integer in, I get a number around 36000. I saw another example online that used %d, so I tried that, but it made no difference. Am I just going in the wrong direction from the start? Thanks in advance.
You never initialized sum; it starts with whatever garbage happens to be in memory, which could just as well be 1,000,000.
Just do:
int inputNum, sum = 0;
First thing, initialize sum to 0. Try then. int sum = 0;
Do it:
int inputNum, sum = 0;
i.e., you have to initialize sum.
In C the value of an uninitialized variable is indeterminate. The value you are getting is garbage from memory.
Initialize sum to zero. int variables don't have any default value unless they are global or static; as a (non-static) local variable inside a function, sum has an indeterminate value.
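For reference, here's a minimal sketch of the digit-sum program with sum initialized; the prompt is shortened, and everything else follows the code as posted:

#include <stdio.h>

int main(void)
{
    int inputNum, sum = 0;          /* sum must start at 0 */

    printf("Type in a positive integer: ");
    if (scanf("%i", &inputNum) != 1)
        return 1;

    do
    {
        sum += inputNum % 10;       /* add the last digit */
        inputNum /= 10;             /* drop the last digit */
    } while (inputNum != 0);

    printf("Your sum is: %i\n", sum);
    return 0;
}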
I've been tasked with coding a program that processes a simple 1D array and prints its element values, but the program has been behaving strangely: it outputs more values than I have array elements. It's also not fully obeying one of my statements (the one that prints a newline character every 8 elements) and not assigning the largest value to my variable. I think the other two problems will go away once the first one is fixed, however.
Here is my brief:
Design, code and test a program that:
Fills a 20 element array (marks) with random numbers between 0 and 100.
Prints the numbers out 8 to a line
Prints out the biggest number, the smallest number and the average of the numbers
And here is my code:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(){
    srand(time(NULL));

    int marks[20];
    int i = 0;
    int sum = 0;
    int min;
    int max;

    for(i; i <= sizeof(marks); i++){
        marks[i] = rand() % 100;
        sum += marks[i];
        if(i % 8 == 0){
            printf("\n");
        }
        printf("%d ", marks[i]);
        if(marks[i] > max){
            max = marks[i];
        }
        else if(marks[i] < min){
            min = marks[i];
        }
    }

    printf("\n\nThe minimum value is: %d", min);
    printf("\nThe maximum value is: %d", max);
    printf("\n\nThe average value is: %d", sum / sizeof(marks));
    return 0;
}
Please can someone help me get the correct output?
The sizeof operator returns the size of the array in bytes, so this code "thinks" your array has 20 * sizeof(int) elements. You will want to just use i < 20 in the loop, or write
for (i; i < sizeof(marks)/sizeof(int); i++) { ...
Note that you probably do not want the <= operator in the for loop either, since arrays are 0-indexed, so marks[20] is actually one past the end of the array.
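For example, dividing the array's total size by the size of one element gives the element count; a small sketch (the %zu format is C99):

#include <stdio.h>

int main(void)
{
    int marks[20] = {0};
    size_t count = sizeof(marks) / sizeof(marks[0]);   /* 20, regardless of sizeof(int) */

    printf("marks has %zu elements\n", count);
    return 0;
}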
There are two problems I can see that invoke undefined behavior in your code.
With for(i; i <= sizeof(marks); i++) you run out of bounds.
int min; int max; are never initialized and you're attempting to use them.
To solve this:
Change the for loop condition to for(i; i < 20; i++). Better yet, use a preprocessor constant like #define SIZ 20 and use it across your code to keep it consistent and robust.
Initialize your local variables. max should start at INT_MIN, and min at INT_MAX (see limits.h for reference, and the sketch after the standard quotes below).
To clarify point 2: max and min are automatic local variables, and if not initialized explicitly, they contain indeterminate values.
C11, chapter §6.7.9:
If an object that has automatic storage duration is not initialized explicitly, its value is indeterminate.
And then, directly from Annex J, §J.2, Undefined behaviour:
The value of an object with automatic storage duration is used while it is indeterminate.
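For example, just the declarations (a sketch; the rest of the loop can stay as posted):

#include <limits.h>   /* INT_MIN, INT_MAX */

int min = INT_MAX;    /* every mark is <= INT_MAX, so the first mark replaces it */
int max = INT_MIN;    /* every mark is >= INT_MIN, so the first mark replaces it */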
if(marks[i]>max){
max = marks[i];
}
else if(marks[i]<min){
min = marks[i];
}
min and max are not initialized here. Make sure to set your compiler warnings at the highest level, so you get a warning message when you forget to initialize variables.
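For example, with gcc something like the following raises the warning level (assuming a hypothetical source file marks.c; the exact diagnostics vary by compiler and optimization level):

gcc -Wall -Wextra -o marks marks.c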
for(i;i<=sizeof(marks);i ++){
This doesn't make sense. Replace sizeof(marks) with the number of times you want to loop, and use < instead of <=.
For example:
const int num_marks = 20; // or use #define
int marks[num_marks];
for(i = 0; i < num_marks; i++) {}
I can't seem to get it right. The question is "Calculate the Average of 5 Integers using an array"
#include <stdio.h>
#include <stdlib.h>

int main()
{
    int avg[5], i, total;
    int average;

    printf("Enter the marks entered in 5 subjects");
    for (i = 0; i < 5; ++i){
        scanf("%d", &avg[i]);
    }

    for (i = 0; i < 5; ++i){
        total = total + avg[i];
    }

    average = (float)total/5;
    printf("The average of 5 marks is %d", average);
    return 0;
}
1) Your answer CAN be a decimal number, but you are storing it in an integer, which discards the fractional part.
The variable average should be declared as float average;
The line where you print the result should be changed to printf("The average of 5 marks is %f",average);
2) Initialize the variable total as int total = 0;
In the for loop's first iteration, the variable total is used uninitialized, so the result is wrong: a variable that is not explicitly initialized holds a garbage value. Here's the whole thing:
#include <stdio.h>

int main() {
    int avg[5], i, total = 0, average;

    printf("Enter the marks entered in 5 subjects");
    for (i = 0; i < 5; ++i) {
        scanf("%d", &avg[i]);
        total += avg[i];
    }

    average = total / 5;
    printf("The average of 5 marks is %d", average);
    return 0;
}
You've just declared the variable total but didn't initialize it, so total contains a garbage value. If you add anything to total, the result will not be correct as expected; it will be added to the garbage value. So initialize the variable total with 0:
int total=0;
You don't need to use a type cast. Just declare the average variable as double:
double average=0;
average = total/5.0;
And you should print it as printf("The average of 5 marks is %lf", average);
total should be initialized to 0:
int avg[5],i,total=0;
total is a local variable, hence it needs to be initialised. Had it been at file scope (i.e. a global variable), or a static variable, you would not have had to initialise it to 0, since such objects are zero-initialised by default.
An initialisation could be simply like:
int avg[5], i, total = 0 ;
or
int avg[5], i, total;
total = 0 ;
Why do you need initialisation? Because of this statement:
total = total + avg[i];
Here total is calculated using its previous value. What is total's previous value the first time this statement is encountered? It could be anything, commonly referred to as a garbage value, and using it invokes undefined behaviour. Hence, you need initialisation to give total a starting value. Note that you don't need to initialise average, because its value does not depend on its previous contents.
Another problem is with the typecasting. Here is the statement:
average = (float)total/5;
You are right to cast total to float (you could also have written total/5.0 instead). However, you are storing the result in an integer, which causes a second conversion, from the float result back to int.
Hence, you need to declare average as a float.
(Note: If having a float result is not your requirement, and you really need an integral answer, you may ignore this part).
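For example, only the changed lines (a sketch; the rest of the program stays as posted):

float average;                      /* float instead of int, so the result keeps its decimals */
average = (float)total / 5;         /* or total / 5.0 */
printf("The average of 5 marks is %f\n", average);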
You need to initialize variables before you use them in your code.
What goes WRONG in your code is,
int avg[5],i,total;
where you have not initialized the int variable total; hence this code will use a garbage value:
for(i=0; i<5; ++i){
total = total + avg[i];
}
You need this correction
int avg[5], i, total=0;