Continuous integration tool for pure C [closed]

I have a set of projects in pure C and COBOL, and I am looking for a continuous integration tool, but I have never used this kind of tool before. My code is in SVN, and the most important features I am looking for are:
Object version tracking
Compile-failure reporting via email
Does anyone have experience using such tools with C and COBOL code?

I use Jenkins for C continuous integration. Jenkins jobs can run bash scripts, so at a minimum you can have the script run make; if make returns non-zero, the build fails. Once you have it up and running there is more you can do, such as collecting gcov/lcov coverage reports.
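To illustrate the exit-code mechanism Jenkins relies on, here is a minimal sketch of a C smoke test that a hypothetical "make check" target could compile and run; smoke_test.c and buffer_init() are made-up names, not part of any real project. The only contract that matters is the process exit status: non-zero fails the make target, which fails the Jenkins build.

    /* smoke_test.c -- minimal sketch; buffer_init() is a hypothetical
     * stand-in for a function from the project under test. A non-zero
     * exit status fails "make check", which fails the Jenkins job. */
    #include <stdio.h>
    #include <stdlib.h>

    static int buffer_init(void) { return 0; }  /* replace with real code */

    int main(void)
    {
        if (buffer_init() != 0) {
            fprintf(stderr, "smoke test failed: buffer_init\n");
            return EXIT_FAILURE;  /* Jenkins sees make fail */
        }
        puts("smoke test passed");
        return EXIT_SUCCESS;
    }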

I've used Jenkins a little bit, and it can do all sorts of interesting things, including integration with most version control systems and automatic emails when builds succeed/fail.
It runs as a web service on pretty much anything that can run Java.

Related

Environment COBOL, C, DB2 on Ubuntu [closed]

Corona sent me home, our company has strict rules, and I don't have access to the network (IBM mainframe). I'm learning to program in a COBOL, C, DB2 environment.
COBOL calls C, C works with DB2.
I'd like to continue working from home but I'm not an administrator.
Do you have any guidance, tips, or tricks on how to set up an environment for this on Ubuntu?
So far I have a couple of editors, DB2 installation, gcc, gnu cobol...
Stuck on the embedded sql precompiler...
GnuCOBOL and gcc both work with Db2-LUW on Ubuntu Linux.
You can build and run GnuCOBOL programs that use embedded SQL.
You can build and run gcc programs that use embedded SQL via the precompiler that comes with Db2-LUW, or which use the Db2 call level interface (a minimal sketch appears below).
Each of these products has its own set of documentation pages online.
You have to spend time studying their respective documentation.
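To make the embedded-SQL route concrete, here is a minimal sketch of a .sqc source file, assuming a local Db2-LUW instance and its standard SAMPLE database; the build steps in the comment follow the pattern of IBM's sample scripts, but the paths and names are assumptions you must adapt to your installation.

    /* hello.sqc -- minimal embedded-SQL sketch (assumes a Db2-LUW
     * instance and its SAMPLE database; adapt names and paths).
     * Build steps, following the pattern of IBM's sample scripts:
     *   db2 connect to sample
     *   db2 prep hello.sqc bindfile      (precompiles to hello.c)
     *   db2 bind hello.bnd
     *   gcc hello.c -I$HOME/sqllib/include -L$HOME/sqllib/lib64 \
     *       -ldb2 -o hello
     */
    #include <stdio.h>

    EXEC SQL INCLUDE SQLCA;

    int main(void)
    {
        EXEC SQL BEGIN DECLARE SECTION;
        char dbname[9] = "SAMPLE";
        EXEC SQL END DECLARE SECTION;

        EXEC SQL CONNECT TO :dbname;
        if (sqlca.sqlcode != 0) {
            fprintf(stderr, "connect failed, SQLCODE=%ld\n",
                    (long)sqlca.sqlcode);
            return 1;
        }
        printf("connected to %s\n", dbname);
        EXEC SQL CONNECT RESET;
        return 0;
    }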
Stack Overflow is more suitable for specific programming questions.
You will get better answers if you learn to ask better questions that:
show your code fragment as plain text (not an image)
detail the environment, versions, and tools that you use
show the command(s) you run (as plain text)
show the error output (as plain text) and mention the expected result
IBM's Db2-LUW ships many example programs and simple build scripts (including for Linux) that show how embedded SQL can be used from C.
This was explained to you in January 2020, in answer to your previous question on the same topic.

C# WinForms application security vulnerability testing tools [closed]

Does anyone have any recommendations for good vulnerability-testing software for C# Windows Forms (not .net) applications?
Preferably one that can also test against a MySQL or SQL Server connection.
There is no tool that is going to match a good code reviewer or penetration tester. But a few tips to get you aimed in the right direction:
Static analysis tools like HP Fortify, IBM AppScan, and CheckMarx do a wonderful job of finding security issues in code, but you really need an experienced code reviewer to get the most out of them. Also, they are not cheap! These tools operate by scanning code, and the main requirement is that you provide the tool with everything needed to build your software (at least this is the case for Fortify and AppScan; I'm not sure whether the same requirement holds for CheckMarx).
IAST tools such as Contrast are also not cheap. However at least in the case of Contrast, they are specifically trying to make it more developer-friendly. IAST tools work by hooking into your binary in your test environment and looking under the hood for bad things that happen.
Dynamic analysis tools such as OWASP ZAP (free) and Burp (not free, but affordable) can run automated scans in your environment, but if you lack experience with these, the value you get is limited. These tools work by scanning a test environment and sending malicious payloads to see how the server responds. A lot of effort is going into making ZAP work in continuous integration build environments.
All of these should work for the technologies that you are using.

How to find the minimum system requirements needed for the program I wrote in the C language? [closed]

I wrote a program in the C language. Now I want to write the system documentation for that program, and I would like to state the minimum system requirements needed to run my program.
How do I find out what they are?
Things you can do:
Try running your app on the oldest machines you can find.
Remove a couple of memory sticks from your computer.
Do you define _WIN32_WINNT in your application? If not, the Windows SDK you use will set the default minimum OS requirement.
You can also try compiling with -D_WIN32_WINNT=xx for an older version to see how far back you can go, based on the Windows API calls you use. windows.h is pretty good at hiding APIs for versions newer than the one you specify with _WIN32_WINNT. Then keep that setting when you compile your test and release binaries.
Here's the MS doc on versioning with _WIN32_WINNT: https://msdn.microsoft.com/library/windows/desktop/aa383745
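As a sketch of the pattern (using Windows 7, 0x0601, purely as an example value): define _WIN32_WINNT before including windows.h, or pass -D_WIN32_WINNT=0x0601 on the compiler command line, and the headers will hide newer APIs for you.

    /* min_ver.c -- sketch: declare Windows 7 as the minimum target.
     * 0x0601 is just an example; pick the oldest version whose APIs
     * you actually use. */
    #define _WIN32_WINNT 0x0601   /* must come before windows.h */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        /* GetTickCount64() needs Vista (0x0600) or later; if you set
         * _WIN32_WINNT below that, windows.h hides the declaration and
         * this file stops compiling -- which is exactly the point. */
        printf("uptime: %llu ms\n", (unsigned long long)GetTickCount64());
        return 0;
    }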
Silly me! I forgot to add that you MUST test on the oldest version you state in your specs, plus the one most used by your target users.

CLI vs Pure C/C++ Library for a program? [closed]

Background / context: I am developing a Linux NAS server (like FreeNAS or Rockstor) using Golang; the main feature will be a JSON REST API so that you can interact with LVM2, shares, packages, etc.
Question: With respect to security, performance, and development time, what are the advantages, disadvantages, and best practices of spawning processes versus using a native library for certain features of a program?
Example: For my particular use case, the NAS management system will use LVM2 to manage volumes. You can either use the CLI to manipulate volumes or attempt to use the LVM2 native C API and wrap it with Golang's cgo package.
EDIT: Rephrased my question / information.
There are two things that may make using exec in its different variants a no-go: security and speed.
Security: If you shell out with system() or friends, you must be absolutely certain that you don't include any strings in the command that may do funny stuff with your command line. It's the same basic problem as SQL code injection, just at a much lower and even more disastrous layer (obligatory XKCD, just replace "'); DROP TABLE Students;--" with valid sh code along the lines of '"; echo "pwnd', well, you get the idea).
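Here is a minimal C sketch of that injection problem and one way around it; the hostile filename is a contrived stand-in for any user-controlled string, and the unsafe variant is left commented out on purpose.

    /* exec_safety.c -- sketch: why system() with untrusted input is
     * dangerous, and how fork()+execv() sidesteps the shell entirely.
     * "filename" stands in for any user-controlled string. */
    #include <stdio.h>
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void)
    {
        /* becomes: notes.txt"; rm -rf ~; echo " */
        const char *filename = "notes.txt\"; rm -rf ~; echo \"";

        /*
         * BAD: the whole string is handed to /bin/sh, so the embedded
         * command actually runs:
         *
         *   char cmd[256];
         *   snprintf(cmd, sizeof cmd, "cat \"%s\"", filename);
         *   system(cmd);
         */

        /* BETTER: execv() takes an argv array; no shell parses the
         * filename, so metacharacters are just bytes in one argument. */
        pid_t pid = fork();
        if (pid == 0) {
            char *argv[] = { "/bin/cat", (char *)filename, NULL };
            execv(argv[0], argv);
            perror("execv");   /* only reached if execv fails */
            _exit(127);
        }
        int status;
        waitpid(pid, &status, 0);
        return 0;
    }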
Speed: When you shell out to an existing program, you create a new process, and that may be a performance hit you cannot tolerate. It's perfectly OK if the task for which you shell out takes more than a few milliseconds (process creation is somewhere in the range of a millisecond on Linux), but if you want more than a thousand calls per second, you definitely need to avoid this overhead.
If these two points are taken care of or proven to be non-issues, then it's perfectly ok to shell out to other processes.

Puppet and NRPE [closed]

Can I replace NRPE with Puppet for management? If the answer is affirmative, could somebody explain the advantages of each?
Thanks,
No
I think you are comparing two different things.
NRPE is an agent for remote monitoring.
Puppet is a configuration management framework, sort of like make(1) for entire Unix and Mac system configurations.
So, if what you want to do is install software and tweak configurations, Puppet is a good place to start.
While you can set up NRPE to allow essentially arbitrary command execution, it's both risky, because of NRPE's simple authentication, and tedious, as it's difficult to manage lots of hosts this way. Puppet is a tool for automating host configuration, which it is fabulous at. If you want to just run a few commands remotely, look into mcollective, pssh, mussh, dsh, sshpt, fabric, pdsh, pussh, clusterit -- there are endless others. Even Ganglia has a remote command execution framework. Tools similar to Puppet are Chef and CFEngine.
DigitalRoss is correct. NRPE and Puppet are in two different spaces.
