Parsing source code - C

I need to parse the source code of different files, each written in a different language, and I would like to do this using C.
To do that, I was thinking of using yacc / lex, but I find them very hard to understand, maybe due to the complete lack of decent documentation (either that, or they really are cryptic).
So my questions are: where can I find some good documentation for yacc / lex, preferably a tutorial style introduction? Or, is there any better way to do this in C? Maybe there's something else I could use instead of yacc / lex, perhaps even written in a different language?

yacc and lex are very powerful tools, built around the theories for compiler construction. To be able to fully understand them you probably need some basics in formal languages, automata theory and compiler construction.
The dragon book is a classic on the subject.
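To give a flavour of what a yacc grammar looks like, here is a minimal sketch of a calculator grammar (illustrative only; a companion lex file would supply the NUMBER tokens, and error handling is omitted):

    %{
    /* calc.y -- a minimal yacc grammar sketch, not production code */
    #include <stdio.h>
    int yylex(void);
    void yyerror(const char *s) { fprintf(stderr, "error: %s\n", s); }
    %}

    %token NUMBER
    %left '+' '-'
    %left '*' '/'

    %%
    input : /* empty */
          | input expr '\n'    { printf("= %d\n", $2); }
          ;

    expr  : NUMBER
          | expr '+' expr      { $$ = $1 + $3; }
          | expr '-' expr      { $$ = $1 - $3; }
          | expr '*' expr      { $$ = $1 * $3; }
          | expr '/' expr      { $$ = $1 / $3; }
          | '(' expr ')'       { $$ = $2; }
          ;
    %%
    int main(void) { return yyparse(); }

The %left declarations resolve operator precedence, which is exactly the kind of thing the theory behind these tools formalizes.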

The second half of Kernighan and Pike's The Unix Programming Environment is an extended introduction to programming an interpreter with lex and yacc. The lex coverage is a little light, as they mostly use a custom scanner.

If you like math (the most important clause in this answer), then write your own compiler-compiler, and then write your compiler with that. I did this once because I was getting bored of writing all the functions for all the productions of a compiler which I had started as a recursive-descent compiler, because the available choices in 2004 didn't please me, and because I had free time while job-hunting. I only used the compiler compiler on the one project, and it is not necessarily thoroughly tested, so it is not on github. I was very happy with the grammar file syntax that I devised.
If I had such a need today I might make a different decision. The newer cutting-edge compiler-compilers seem to have changed a lot in the last 8 years.

Related

In what languages besides C can I write a C library?

I want to write a library that is dynamically loadable and callable from C code, but I really don't want to write it in C - the code is security critical, so I want a language that makes it easier to have confidence that my code is correct. What are my options?
To be more specific, I want C programmers to be able to #include this, and -l that, and start using my library just as if I had written it in C. I'd like programmers in other languages to be able to use their favourite tools for linking to C libraries to link to it. Ideally I'd like that to be possible on every platform that supports C, but I'll settle for Linux, Windows and MacOS.
Anything that compiles to native code. So you might Google for that - "languages that compile to native code." See, e.g., Programming languages that compile to native code and have the batteries included
C++ is often the choice for this. It compiles to native code, and provided you keep your interfaces simple, it's easy to write an adapter layer.
Objective C and Fortran are also possible.
It sounds like you are looking for a language with C ABI compatibility, or one that can be described as producing native code. So long as it can be compiled to a valid object file (typically an .obj or .o file) that is accepted by the linker, that is the main criterion. You also then want to write a header file as a convenience for any client code written in C (or a closely related language/variant thereof).
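For example, whatever language the implementation is written in, a plain C header like the following is what C callers would #include, provided the implementation exports these symbols with C linkage and the C ABI (the names here are made up for illustration):

    /* mylib.h -- illustrative C-facing interface; names are hypothetical */
    #ifndef MYLIB_H
    #define MYLIB_H

    #ifdef __cplusplus
    extern "C" {            /* keep C linkage when included from C++ */
    #endif

    int mylib_init(void);
    int mylib_process(const unsigned char *buf, unsigned long len);

    #ifdef __cplusplus
    }
    #endif

    #endif /* MYLIB_H */

C callers then just #include "mylib.h" and link against the library, exactly as the question asks.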
As mentioned by others, you need a pretty good reason for choosing a language other than C, as it is the lingua franca of low-level/systems software. Assembler is an option, although harder to port between platforms. D is a more portable - but less widespread - alternative which is designed to produce secure, efficient native code with a minimum of fuss. There are many others.
Almost every security-critical application I know of is written in C. I don't believe there is any other language with a stronger track record for producing secure applications.
C is often said to be a poor language for security, usually by people who don't understand it.
If you want C programmers to use your library, use C. Doing anything else is tying one hand behind your back whilst trying to walk on a balance beam (the gymnastics equipment). Sure, there are dozens of other languages that are CAPABLE of interfacing to C, but it typically involves using a C layer and then stuffing the C data types into a language specific data type (Java Objects, Python Objects, etc, etc), and when the call is finished, you use the same conversion back to a C data type. Just makes it harder to work with, and potentially slower if you don't get all the design decisions right. And people won't understand the source code, so won't like to use it (see more about this below).
If you want security, then write very good code, wearing your "security aspects" hat firmly on at all times, find a security mailing list or website and post it there for review, take the review comments on board, understand them, and fix the ones that are meaningful to fix. Distribute the source code to the users, so people can see what your code does. Those that understand security will know what to look for and understand that you have done a good job (or a bad job, whichever is applicable) - and those who don't will hopefully trust the right people. If it's good, people will use it. If it's "hidden", and not easy to access, you won't get many customers, no matter what language you use.
Don't worry, you won't reveal anything more from releasing source. If there is a flaw in the code, and it is popular (or important) enough, someone will find the flaw, even if you publish only binaries. For those skilled in reverse engineering, not having source code is only a small obstacle.
Security doesn't stem from using a specific language or a specific tool, it stems from good design and good basic understanding of the problems with security.
And remember security by obscurity (whether that means "hidden source code" or "unusual language" or something else obscure) is false security.
You might be interested in ATS, http://ats-lang.sourceforge.net/. ATS compiles via C, can be as efficient as C, and can be used in a way that is ABI-compatible with C. From the project website:
ATS is a statically typed programming language that unifies implementation with formal specification. It is equipped with a highly expressive type system rooted in the framework Applied Type System, which gives the language its name. In particular, both dependent types and linear types are available in ATS. The current implementation of ATS (ATS/Anairiats) is written in ATS itself. It can be as efficient as C/C++ (see The Computer Language Benchmarks Game for concrete evidence) and supports a variety of programming paradigms
ATS's dependent and linear type system helps produce static guarantees about your code, including various aspects of resource management safety.
Chris Double has been writing a series of articles exploring the power of ATS's type system for systems programming here: http://bluishcoder.co.nz/tags/ats/. Of particular note is this article: http://bluishcoder.co.nz/2012/08/30/safer-handling-of-c-memory-in-ats.html
This document covers aspects of calling back and forth between ATS and C code: https://docs.google.com/document/d/1W6DYQApEqKgyBzMbvpCI87DBfLdNAQ3E60u1hUiMoU0
The main downside is that dependently-typed programming is still a daunting prospect, even for non-systems programming. The syntax of the language is also a bit weird: consider lexical quirks such as the use of abst@ype as a keyword. Finally, ATS is to some degree a research project, and I personally don't know whether it would be sensible to adopt for a commercial endeavour.
Theoretically, it's going to be Fortran: less indirection (as in: my array is [here], not just a pointer to here, and this is true of most but not all of your data structures and variables).
However... There are many gotchas and quirks in Fortran: not, perhaps, as many as in C but you probably know your way around C rather better than Fortran. Which is the point behind most of the comments saying 'Know your code' - but do you really know what your compiler is doing?
Knowing you, I'm prepared to take it on trust that you do, for C. Most programmers don't. You do not know and cannot know what a local JVM or JIT compiler does, and that's a black hole in your security model if you're using Java or C# or scripting languages.
Ignore anyone who tells you that the hairy-chested he-men of secure computing write their own assembler: they probably don't even know the security errors they're making in any and all nontrivial projects they release. Know your compiler, indeed.
You could write it in Lua - providing a C API to a Lua library is relatively straightforward. C++ is also an option, though of course you'd have to write C wrappers and make sure no exceptions can escape your functions. But honestly, if it's security critical, the minor inconveniences of the C language shouldn't be that much of a big deal. What you really should be doing is prove the correctness of your program where feasible, and test extensively where that's not possible.
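For the Lua route, a rough sketch of the C-facing wrapper might look like this (illustrative only: it assumes liblua is linked, the file name mylib.lua and the Lua function process are made up, and error handling is minimal):

    /* Exposing a function implemented in Lua through a plain C API (sketch). */
    #include <lua.h>
    #include <lauxlib.h>
    #include <lualib.h>

    static lua_State *L;

    int mylib_init(void) {
        L = luaL_newstate();
        luaL_openlibs(L);
        return luaL_dofile(L, "mylib.lua") == 0 ? 0 : -1;  /* load the Lua code */
    }

    int mylib_process(int x) {
        lua_getglobal(L, "process");        /* the Lua function to call */
        lua_pushinteger(L, x);
        if (lua_pcall(L, 1, 1, 0) != 0)     /* 1 argument, 1 result */
            return -1;
        int r = (int)lua_tointeger(L, -1);
        lua_pop(L, 1);
        return r;
    }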
You can write a library in Java. JNI is normally used to call C from Java, but it can be used the other way around.
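With the JNI invocation API, the C side embeds a JVM and calls into the Java code. A rough sketch (the class name MyLibrary and method doWork are hypothetical, and most error checks are omitted for brevity):

    /* Embedding a JVM from C via the JNI invocation API (sketch). */
    #include <jni.h>

    int main(void) {
        JavaVM *jvm;
        JNIEnv *env;
        JavaVMInitArgs vm_args = { 0 };
        vm_args.version = JNI_VERSION_1_8;

        if (JNI_CreateJavaVM(&jvm, (void **)&env, &vm_args) != JNI_OK)
            return 1;

        /* Look up the (hypothetical) class and call a static method on it. */
        jclass cls = (*env)->FindClass(env, "MyLibrary");
        jmethodID mid = (*env)->GetStaticMethodID(env, cls, "doWork", "()V");
        (*env)->CallStaticVoidMethod(env, cls, mid);

        (*jvm)->DestroyJavaVM(jvm);
        return 0;
    }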
There is finally a decent answer to this question: Rust.

What are resources for building a static analyzer for C in C?

I have a school project to develop a static analyzer in C for C.
Where should I start? What are some resources which could assist me?
I am assuming I will need to parse C, so what are some good parsers for C or tools for building C parsers?
I would first head over to ANTLR and look at its getting-started guide; it has a wealth of information about parsing etc. I personally use ANTLR as it gives a choice of code generation targets.
To use ANTLR you need a C or C++ grammar file; pick one up and start playing.
Anyway, have fun with it.
Probably your best starting point would be Clang (with the proviso that it already has a static analyzer, so unless you want to write one for its own sake, you might be better off using/enhancing the existing one).
Are you sure that you want to write the analyzer in C?
If you were using a modern language (e.g. C#, Java, Python), then I would second spgennard's suggestion of ANTLR for the parser.
If writing the analyzer in C is a requirement then you are stuck with lex and yacc (flex and bison) or maybe a hand-crafted parser.
Looks like Uno comes close to what you want to do. It uses lex/yacc and includes the grammar files. The analysis part however is written in C++.
Maybe you can get some more ideas about the how and what from tools listed at SpinRoot. Wikipedia also has some good info.
Parsing is the easiest and least important part of a static analyser. ANTLR was already suggested; it should be sufficient for parsing plain C (but not C++). Just a little tip - do not implement your own preprocessor; it is better to reuse the output of gcc -E.
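For instance, instead of preprocessing yourself, a C analyser can simply read the preprocessor's output (a sketch; popen is POSIX, and input.c is just a placeholder path):

    /* Let gcc do the preprocessing and read its output (sketch). */
    #include <stdio.h>

    int main(void) {
        FILE *p = popen("gcc -E -P input.c", "r");  /* -P drops linemarkers */
        if (!p) return 1;
        char line[4096];
        while (fgets(line, sizeof line, p))
            fputs(line, stdout);    /* a real analyser would tokenize here */
        pclose(p);
        return 0;
    }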
As for the rest, you can take a look at some of the existing analysers' sources, namely Clang and CIL, and read about SSA representation and abstract interpretation. Choosing the right intermediate representation for your code is key.
I doubt it can be an easy task in plain C, so you'd probably end up implementing some sort of DSL on top of it to handle ASTs and transforms. Sounds like something much bigger than a typical school project.

Starting off a simple (the simplest perhaps) C compiler?

I came across this: Writing a compiler using Turbo Pascal
I am curious if there are any tutorials or references explaining how to go about creating a simple C compiler. I mean, it is enough if it gets me to the level of making it understand arithmetic operations. I became really curious after reading this article by Ken Thompson. The idea of writing something that understands itself seems exciting.
Why did I put up this question instead of asking Google? I tried Google and the Pascal one was the first link. The rest did not seem relevant, and added to that... I am not a CS major (so I still need to learn what all those tools like yacc do), I want to learn this by doing, and I am hoping people with more experience are better at these things than Google. I want to read some article written in the same spirit as the one I listed above but which highlights at least the bootstrapping phases of building a simple C compiler.
Also, I don't know the best way to learn. Do I start off building a C compiler in C or some other language? Do I write a C compiler or some other language? I feel questions like this are better answered once I have some direction to explore. Any suggestions?
A compiler consists of three pieces:
A parser
An abstract syntax tree (AST)
An assembly code generator
There are lots of nice parser generators that start with language grammars. Maybe ANTLR would be a good place for you to start. If you want to stick to C roots, try lex/yacc or bison.
There are grammars for C, but I think C in its entirety is complex. You'd do well to start off with a subset of the language and work your way up.
Once you have an AST, you use it to generate the machine code that you'll run.
It's doable, but not trivial.
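To make the AST piece concrete, a minimal node type for arithmetic expressions might look like this in C (names and the trivial "back end" are illustrative only):

    /* A minimal AST for arithmetic expressions, plus a trivial tree walk. */
    typedef enum { NODE_NUM, NODE_ADD, NODE_MUL } NodeKind;

    typedef struct Node {
        NodeKind kind;
        int value;               /* used when kind == NODE_NUM */
        struct Node *lhs, *rhs;  /* used for the binary operators */
    } Node;

    /* Walk the tree bottom-up; a real compiler would emit code here
     * instead of computing the value directly. */
    int eval(const Node *n) {
        switch (n->kind) {
        case NODE_NUM: return n->value;
        case NODE_ADD: return eval(n->lhs) + eval(n->rhs);
        case NODE_MUL: return eval(n->lhs) * eval(n->rhs);
        }
        return 0;
    }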
I'd also check Amazon for books about writing compilers. The Dragon Book is the classic, but there are more modern ones available.
UPDATE: There have been similar questions on Stack Overflow, like this one. Check out those resources as well.
I recommend this tutorial:
LLVM tutorial
It is a small example on how to implement a "small language" compiler. The source code is very small and is explained step by step.
There is also a C front-end library for LLVM (the Low Level Virtual Machine, which represents the internal structure of a program):
Clang
For what it's worth, the Tiny C Compiler is a pretty full-featured C compiler in a relatively small source package. You might benefit from studying that source, as it's probably significantly easier to understand than trying to comprehend all of GCC's source base, for instance.
This is my opinion (and conjecture): it will be hard to write a compiler without understanding data structures normally covered in undergraduate (post-secondary) Computer Science classes. This doesn't mean you cannot, but you will need to know essential data structures such as linked lists and trees.
Rather than writing a full or standards-compliant C language compiler (at least at the start), I would suggest limiting yourself to a basic subset of the language, such as common operators, integer-only support, and basic functions and pointers. One classic example of this was Ron Cain's Small-C, made popular by a series of articles in Dr. Dobb's Journal in, I believe, the 1980s. They publish a CD with James Hendrix's out-of-print book, A Small-C Compiler.
What I would suggest is following Crenshaw's tutorial, but writing it for a C-like language compiler, targeting whatever CPU you wish (Crenshaw targets the Motorola 68000). In order to do this, you will need to know the basic assembly of whichever target you want to run the compiled programs on. This could include an emulator for a 68000 or MIPS, which have arguably nicer instruction sets than the venerable CISC instruction set of the Intel x86 (16/32-bit).
There are many potential books that can be used as starting points for learning compiler/translator theory (and practice). Read the comp.compilers FAQ, and reviews at various online book sellers. Most introductory books are written as textbooks for sophomore- to senior-level undergraduate Computer Science classes, so they can be slow reading without a CS background. One older book that might be more introductory, but easier to read than "The Dragon Book", is Introduction to Compiler Construction by Thomas Parsons. It is older, so you should be able to find a used copy from your choice of online book sellers at a reasonable price.
So I'd say, try starting with Jack Crenshaw's Let's Build a Compiler tutorial, write your own, following his examples as a guide, and build the basics of a simple compiler. Once you have that working, you can better decide where you wish to take it from that point.
Added:
In regards to the bootstrapping process: since there are existing C compilers freely available, you do not need to worry about bootstrapping. Write your compiler with separate, existing tools (GCC, Visual C++ Express, MinGW / djgpp, tcc), and you can worry about self-compiling your project at a much later stage. I was surprised by this part of the question until I realized you were brought to the idea of writing your own compiler by reading Ken Thompson's ACM Turing Award lecture, Reflections on Trusting Trust, which does go into the compiler bootstrapping process. It's a moderately advanced topic, and also simply a lot of hassle. I found even bootstrapping the GCC C compiler under older Unix systems that already included a C compiler (Digital OSF/1 on the 64-bit Alpha) to be a slow, time-consuming, error-prone process.
The other sort-of question was what a compiler tool like yacc actually does. Yacc (Yet Another Compiler Compiler; Bison is the GNU version) is a tool designed to make writing a compiler (or translator) parser easier. Based on the formal grammar for your target language that you feed it, it generates a parser, which is one portion of a compiler's overall design. Next is lex (or flex from GNU), which is used to generate a lexical analyzer or scanner; this is often combined with the yacc-generated parser to form the skeleton of the front end of a compiler. These tools arguably make writing a front end easier than writing a lexical analyzer and parser yourself, although many compiler writers don't use them; Crenshaw's tutorial does not, and you don't need to either. Of course Crenshaw admits the tutorial's parser is quite basic.
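For a sense of what the lex half looks like, here is a minimal scanner specification (a sketch; it assumes a companion yacc grammar, built with yacc -d, that defines a NUMBER token):

    %{
    /* calc.l -- a minimal lex specification sketch */
    #include <stdlib.h>
    #include "y.tab.h"    /* NUMBER and yylval, generated by yacc -d */
    %}

    %%
    [0-9]+      { yylval = atoi(yytext); return NUMBER; }
    [ \t]       ;                     /* skip blanks */
    \n          { return '\n'; }
    .           { return yytext[0]; }
    %%

    int yywrap(void) { return 1; }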
Crenshaw's tutorial also skips generating an AST (abstract syntax tree), which simplifies but also limits the tutorial compiler. It lacks most if not all optimization, and is very tied to the specific programming language and the particular assembly language emitted by the "back-end" of the compiler. Normally the AST is a middle piece where some optimization can be performed, and serves to de-couple the compiler front-end and back-end in design. For a beginner without a Computer Science background, I'd suggest not worrying about not having an AST for your first compiler (or at least the first version of it). I think keeping it small and simple will help you finish writing a compiler, in its first version, and you can decide from there how you want to proceed then.
You might be interested in the book/course The Elements of Computing Systems:Building a Modern Computer from First Principles.
Note that this isn't about building a "pc" from stuff you bought off newegg. It begins with a description of Boolean logic fundamentals, and builds a virtual computer from the lowest levels of abstraction to progressively higher levels of abstraction. The course materials are all online, and the book itself is fairly inexpensive from Amazon.
In the course, in addition to "building the hardware", you'll also implement an assembler, virtual machine, compiler, and rudimentary OS, in a step-wise fashion. I think this would give you enough of a background to delve deeper into the subject area with some of the more commonly recommended resources listed in the other answers.
In The Unix Programming Environment, Kernighan and Pike walk through 5 iterations of building a calculator, working from simple C-based lexical analysis and immediate execution up to yacc/lex parsing and code generation for an abstract machine. Because they write so wonderfully, I can't suggest a smoother introduction. It is certainly smaller than C, but that is likely to your advantage.
How do I [start writing] a simple C compiler?
There's nothing simple about compiling C. The best simple C compiler is lcc by Chris Fraser and David Hanson. They spent 10 years working on the design to make it as simple as they possibly could, while still generating reasonably good code. If you have access to a university library, you should be able to get their book.
Do I start off building a C compiler in C or some other language?
Some other language. One time I got to ask Hanson what lessons he and Fraser had learned by spending 10 years on the lcc project. The main thing Hanson said was
C is a lousy language to write a compiler in.
You're better off using Haskell or some dialect of ML. Both languages offer functions over algebraic data types, which is a perfect match to the problems faced by the compiler writer. If you still want to pursue C, you could start with George Necula's CIL, which is a big chunk of a C compiler written in ML.
I want to read some article written in the same spirit as the one I listed above but that which highlights at least the bootstrapping phases...
You won't find another article like Ken's. But Andrew Appel has written a nice article called Axiomatic Bootstrapping: A Guide for Compiler Hackers I couldn't find a free version but many people have access to the ACM Digital Library.
Any suggestions?
If you want to write a compiler,
Use Haskell or ML as your implementation language.
For your first compiler, pick a very simple language like Oberon or like P0 from Niklaus Wirth's book Algorithms + Data Structures = Programs. Wirth is famous for designing languages that are easy to compile.
You can write a C compiler for your second compiler.
A compiler is a complex piece of software that covers several aspects:
Input processing, involving lexing and parsing
Building a symbol table of every variable used, and an abstract syntax tree (AST)
From the AST, generating the machine-code binary
This is by no means exhaustive - it is an abstract bird's-eye view from the top of a mountain. It boils down to getting the syntax notation correct and ensuring that malformed input does not throw the compiler off; in fact, good input processing should never fall on its knees no matter how malformed, terrible or abusive the input thrown at it is. You also have to decide what the output is going to be: if it is machine code, that implies you may have to get to know the processor's instructions intimately, including memory addressing for variables and so on.
Here are some links for you to get started:
There was a port of Jack Crenshaw's code to C... (I recall downloading it months ago...)
Here's a link to a similar question here on SO.
Also, here's another small compiler tutorial, for a BASIC-to-x86-assembler compiler.
Tiny C Compiler
Hendrix's Small C Compiler found here.
It might be worthwhile to learn about functional programming, too. Functional languages are well-suited to writing a compiler both in and for. My school's intro compilers class contained an intro to functional languages and the assignments were all in OCaml.
Funny you should ask this today, since just a couple days ago I wrote a lambda calculus interpreter. Lambda calculus is the granddaddy of all functional languages. It's just 200 lines long (in C++, incl. error reporting, some pretty printing, some unicode) and has a two-phase structure, with an intermediate format that could be used to generate code.
Not only is starting small and building up the most practical approach to compilers, it also encourages good, modular, organizational practice.
A compiler is a very large project, although I suppose it wouldn't hurt to try.
I know of at least one C compiler written in Pascal, so it's not the most insane thing you could do. I personally would pick a more modern language in which to implement my C compiler project, both for the simplicity (it's easy to d/l packages for Python, Ruby, C, C++ or Java) and because it will look better on your resume.
In order to do a compiler as a beginner project, though, you will need to drink all of the Agile kool-aid.
Always have something running, even if it doesn't do much of anything. Add things to your compiler only in small steps. ("Frequent releases".) Pick a viciously tiny subset of the language and implement that first. (Support only i = 0; at first and expand things from there.)
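As an illustration of how small that first step can be, here is a toy hand-written parser that accepts only statements of the form identifier = number; and pretends to generate code for them (everything here is made up for illustration):

    #include <ctype.h>
    #include <stdio.h>

    /* Accepts "<identifier> = <number>;" and nothing else. */
    int parse_stmt(const char *s) {
        char name[64]; int n = 0;
        while (isspace((unsigned char)*s)) s++;
        if (!isalpha((unsigned char)*s)) return 0;
        while (isalnum((unsigned char)*s) && n < 63) name[n++] = *s++;
        name[n] = '\0';
        while (isspace((unsigned char)*s)) s++;
        if (*s++ != '=') return 0;
        while (isspace((unsigned char)*s)) s++;
        if (!isdigit((unsigned char)*s)) return 0;
        long value = 0;
        while (isdigit((unsigned char)*s)) value = value * 10 + (*s++ - '0');
        while (isspace((unsigned char)*s)) s++;
        if (*s != ';') return 0;
        printf("store %ld -> %s\n", value, name);  /* stand-in for code generation */
        return 1;
    }

    int main(void) { return parse_stmt("i = 0;") ? 0 : 1; }

From here, "frequent releases" means adding one construct at a time: more operators, then expressions, then control flow.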
If you want a mind-blowing experience that teaches you how to write compilers that compile themselves, you need to read this paper from 1964.
META II a syntax-oriented compiler writing language by Val Schorre.
In 10 pages, it tells you how to write compilers, how to write meta compilers, provides a virtual metacompiler instruction set, and a sample compiler built with the metacompiler.
I learned how to write compilers from this paper back in the late 60s, and used the ideas to construct C-like languages for several minicomputers and microprocessors.
If the paper is too much by itself (it isn't!) there's an online tutorial which will walk you through the whole thing.
And if getting the paper from the original link is awkward because you are not an ACM member, you'll find that the tutorial contains all the details anyway. (IMHO, for the price, the paper itself is waaaaay worth it).
10 pages!
I would not recommend starting with C as the language to implement, nor with any of the compiler-generator or parser-generator tools. C is a very tricky language, and it's probably a better idea to just make up a language of your own. It can be a little C-like (e.g. use curly brackets to indicate the function body, use the same type names, so you don't have to remember what you called everything).
The tools for making compilers and parsers are great, but have the problem of really being a shorthand notation. If you don't know how to create a compiler in longhand, the shorthand will seem cryptic, needlessly restrictive etc. So write your own simple compiler first, then continue on from there. I also recommend you don't start generating actual machine code unless you eat and breathe assembler. Create your own bytecode interpreter with a VM.
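A bytecode VM really can start out tiny; here is a toy stack-based interpreter (the opcodes and encoding are invented for illustration, not taken from any real VM):

    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

    /* Execute a flat array of opcodes on a small value stack. */
    static void run(const int *code) {
        int stack[256], sp = 0, pc = 0;
        for (;;) {
            switch (code[pc++]) {
            case OP_PUSH:  stack[sp++] = code[pc++];          break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp];  break;
            case OP_MUL:   sp--; stack[sp - 1] *= stack[sp];  break;
            case OP_PRINT: printf("%d\n", stack[sp - 1]);     break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void) {
        /* Bytecode for "print (2 + 3) * 4" -- what your compiler would emit. */
        int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                          OP_PUSH, 4, OP_MUL, OP_PRINT, OP_HALT };
        run(program);
        return 0;
    }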
As to what language you should use to create your first compiler: It doesn't really matter, as long as the language is fairly complete. You will be reading input text, building data structures from them and writing out binary data. So if a language makes those things easier in any way, that's a point in favor of it. Pick a language you know well, so you can focus on creating the compiler, not learning the language. I usually use an OO language, which makes the syntax tree easier to write, a functional language would probably also work if you are familiar with that.
I've blogged a lot about programming languages, so you might find some useful postings here: http://orangejuiceliberationfront.com/category/language-design/
In particular, http://orangejuiceliberationfront.com/how-to-write-a-compiler/ is a starter on the particulars of parsing common constructs and generating something useful from that, as well as http://orangejuiceliberationfront.com/generating-machine-code-at-runtime/ which talks about actually spitting out Intel instructions that do something.
Oh, regarding bootstrapping of a compiler: you probably won't be able to do that right from the start. There is a fair amount of work involved in creating a compiler. So not only would writing a bootstrapping compiler involve writing the compiler (in some other language), once you have it, you would then have to write a second version of the compiler using itself. That's twice the work - OK, maybe not twice, but certainly more - plus the debugging needed in both the existing and the bootstrapped new compiler until it all works. That said, once you have a working compiler, bootstrapping is a good way to test its completeness. I'd go for the easy successes first, then move on from there.
In any event, have fun!

Why is C the language of compilers, when a Scheme subset would seem to be a better fit?

I was just listening to episode 57 of Software Engineering Radio
(TRANSCRIPT: http://www.se-radio.net/transcript-57-compiletime-metaprogramming )
I'm only 40 minutes in, but I'm wondering why C is the language of compilers, when a Scheme subset (or some other HLL) would seem to be a better fit?
(excluding the obvious reason of not wanting to rewrite gcc)
PS originally posted this at LtU http://lambda-the-ultimate.org/node/3754
I won't bother to listen to 40 minutes of radio to perhaps understand your question more thoroughly, but I would claim the exact opposite. Only a minority of compilers are written in C. I rather have the impression that (at least where appropriate), most compilers are implemented in the language they are meant to compile.
C need not be the language for compilers, but it does have some advantages. C is available on almost all platforms and that makes it easy to port and bootstrap the compiler. C is closer to the hardware and makes possible many optimizations that will be difficult to achieve in other languages. It is easy for a compiler written in C to co-exist with other languages, libraries and systems as most of them provide a C interface. It is also easy for others to extend the compiler as C is the Esperanto of system programmers.
Well, one reason will be the issue of bootstrapping the compiler on unsupported architectures. That will usually require the existence of a working compiler for that architecture, which generally means C. I remember trying to compile MIT-scheme from source, and getting really pissed off that it required MIT-scheme to be installed before I could build MIT-scheme.
Incidentally, I'm not sure I agree with your premise... C certainly seems to be the most widely deployed language, but other language compilers (e.g. MIT-scheme) are often implemented in those languages.
It's probably a combination of factors:
C compilers are available for almost every platform, making it easier to build a new compiler for a new language.
History: C is a very popular language, so it makes sense that a lot of projects are in C (no matter the project).
Scheme, specifically, is very unpopular (compared to C).
C has Flex and Yacc, which help with implementing the front end (lexer and parser) of a compiler; if I remember right, their output is limited to C code.
Many compilers today are written in languages other than C (such as Scheme). To make them portable they initially generate C code as a target language.
I think a lot has to do with backends. Someone mentioned Flex and Yacc, but there's also GCC and LLVM that will help you with a lot of other important stuff, like optimizations.

What is a good approach to building a new compiler?

I have some experience with the compiler phases and I am interested in the Programming Languages & Compilers field, and I hope somebody can explain what a good approach is to writing a new compiler from scratch for a new programming language (I mean the STEPS).
The first step is to read the Dragon Book.
It offers a good introduction to the whole field of compiler building, but also goes into enough detail to actually build your own.
As for the following steps I suggest following the chapters of the book. It's not written as a tutorial, but nevertheless offers much practical advice, which makes it an ideal hub for your own ideas and research.
Please don't use the Dragon Book, it's old and mostly outdated (and uses weird names for most of the stuff).
For books, I'd recommend Appel's Tiger Book, or Cooper's Engineering a Compiler. I'd strongly suggest you use a framework like LLVM so you don't have to re-implement a bunch of stuff for code generation etc.
Here is the tutorial for building your language with llvm: http://llvm.org/docs/tutorial/
I would look at integrating your language/front end with the GNU compiler framework.
That way you only (ONLY!) need to write the parser and a translator to GCC's intermediate representation. You get the optimiser, object code generation for the chip of your choice, the linker, etc. for free.
Another alternative would be to target the Java JVM; the virtual machine is well documented and the JVM instruction set is much more straightforward than x86 machine code.
I managed to write a compiler without any particular book (though I had read some compiler books in the past, just not in any real detail).
The first thing you should do is play with any of the "compiler compiler" type tools (flex, bison, antlr, javacc) and get your grammar working. Grammars are mostly straightforward, but there are always nitty-gritty bits that get in the way and make a ruin of everything - especially things like expressions, precedence, etc.
Some of the older, simpler languages are simpler for a reason: it makes the parsers "just work". Consider a Pascal variant that can be processed solely through recursive descent.
I mention this because without your grammar, you have no language. If you can't parse and lex it properly, you get nowhere very fast. And watching a dozen lines of sample code in your new language get turned into a slew of tokens and syntax nodes is actually really amazing, in a "wow, it really works" kind of way. It's literally almost an "it all works" or "none of it works" kind of thing, especially at the beginning. Once it actually works, you feel like you might be able to really pull it off.
And to some extent that's true, because once you get that part done, you have to get your fundamental runtime going. Once you get "a = 1 + 1" compiled, the bulk of the new work is behind you and now you just need to implement the rest of the operators. It basically becomes an exercise in managing lookup tables and references, and having some idea where you are at any one time in the process.
You can run out on your own with a brand new syntax, innovative runtime, etc. But if you have the time, it's probably best to do a language that's already been done, just to understand and implement all of the steps, and to think about how, if you were writing the language you really want, you would do things differently from this existing one.
There are a lot of mechanics to compiler writing and just doing the process successfully once will give you a lot more confidence when you want to come back and do it again with your own, new language.

Resources