How to calculate the checksum of a file in Go

I need to calculate the checksum of a file to verify the integrity of its data, so that large files don't have to be downloaded again when they haven't changed. Can you give me any idea?

You can do that with crypto/sha256, streaming the file through the hasher so the whole file never has to sit in memory (this assumes imports of "crypto/sha256", "encoding/hex", "io", "os", and glog for logging):

f, err := os.Open(path)
if err != nil {
    glog.Fatal(err)
}
defer f.Close()

hasher := sha256.New()
if _, err := io.Copy(hasher, f); err != nil {
    glog.Fatal(err)
}
value := hex.EncodeToString(hasher.Sum(nil))

Related

How to read a word in upper and lower case from a file in Go?

I have a task to read a file, scan it for a "bad word", and output "True" or "False" depending on whether the file contains that word.
I wrote this function:
func readFile(fileName string, tabooName string) {
    dataFile, err := os.Open(fileName)
    if err != nil {
        log.Fatal(err)
    }
    defer dataFile.Close()
    scanner := bufio.NewScanner(dataFile)
    for scanner.Scan() {
        if strings.Contains(scanner.Text(), strings.ToLower(tabooName)) || strings.Contains(scanner.Text(), strings.ToUpper(tabooName)) {
            fmt.Println("True")
            break
        } else {
            fmt.Println("False")
            break
        }
    }
    if err := scanner.Err(); err != nil {
        log.Fatal(err)
    }
}
But it does not work if the "bad word" is written in MixedCase, e.g. "Harsh".
For example:
Wrong answer in test #2
This word belongs to bad words
Please find below the output of your program during this failed test.
Note that the '>' character indicates the beginning of the input line.
---
forbidden_words.txt
HARSH
False
How can I do it correctly?
UPDATE: I don't know why, but this function is working now. At first I had written the whole thing inline in main and it worked; now it works as a separate function too. I don't know why.
You could use strings.Contains and strings.ToLower slightly differently: lower-case both the word and each line, then do a single Contains check:
func readFile(fileName string, tabooName string) {
    file, err := os.Open(fileName)
    if err != nil {
        log.Fatal(err)
    }
    defer file.Close()

    tabooNameLowered := strings.ToLower(tabooName)
    scanner := bufio.NewScanner(file)
    for scanner.Scan() {
        loweredLine := strings.ToLower(scanner.Text())
        if strings.Contains(loweredLine, tabooNameLowered) {
            fmt.Println("True")
            return
        }
    }
    fmt.Println("False")
    if err := scanner.Err(); err != nil {
        log.Fatal(err)
    }
}
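One caveat: strings.Contains matches substrings, so a lower-cased Contains check will also flag "harsher" when the bad word is "harsh". If the exercise requires whole-word matches, one alternative sketch (the helper name and inputs here are my own, not from the question) is to scan word by word and compare with strings.EqualFold, the standard library's case-insensitive equality check:

```go
package main

import (
	"bufio"
	"fmt"
	"strings"
)

// containsWord reports whether taboo appears in text as a whole word,
// ignoring case. bufio.ScanWords splits on whitespace.
func containsWord(text, taboo string) bool {
	scanner := bufio.NewScanner(strings.NewReader(text))
	scanner.Split(bufio.ScanWords)
	for scanner.Scan() {
		if strings.EqualFold(scanner.Text(), taboo) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(containsWord("HARSH words here", "harsh")) // true
	fmt.Println(containsWord("harsher", "harsh"))          // false (whole words only)
}
```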

Cannot encrypt more than 4096 bytes using AES in Go

I am building a program that contains some large strings I want encrypted (so they cannot simply be read out of the binary). To do so, I am trying to store the strings AES-encrypted and decrypt them at runtime.
I have looked into how to do this and managed to encrypt some small texts, but my program cannot encrypt more than 4096 bytes of text: it truncates the input once it reaches that length. I was wondering whether there is some way to enlarge the buffer the Go AES code uses, or another way around this problem.
This is the code I am using to encrypt the strings:
package main

import (
    "bufio"
    "crypto/aes"
    "crypto/cipher"
    "crypto/rand"
    "encoding/base64"
    "fmt"
    "io"
    "os"
    "strings"
)

func encrypt(text, keyText string) string {
    key := []byte(keyText)
    plaintext := []byte(text)
    block, err := aes.NewCipher(key)
    if err != nil {
        panic(err)
    }
    ciphertext := make([]byte, aes.BlockSize+len(plaintext))
    iv := ciphertext[:aes.BlockSize]
    if _, err := io.ReadFull(rand.Reader, iv); err != nil {
        panic(err)
    }
    stream := cipher.NewCFBEncrypter(block, iv)
    stream.XORKeyStream(ciphertext[aes.BlockSize:], plaintext)
    return base64.URLEncoding.EncodeToString(ciphertext)
}

func main() {
    fmt.Println("Encryption Program v1.0.0 ~ by: Sam Sepiol")
    fmt.Print("[Encryption Key (32 bytes)]>>: ")
    in := bufio.NewReader(os.Stdin)
    plaintextKey, err := in.ReadString('\n')
    if err != nil {
        print("Error reading line\n", 9)
    }
    plaintextKey = strings.Replace(plaintextKey, "\n", "", -1) // strip the newline
    for {
        fmt.Print(">> ")
        in := bufio.NewReader(os.Stdin)
        plaintext, err := in.ReadString('\n')
        if err != nil {
            print("Error reading line\n", 9)
        }
        plaintext = strings.Replace(plaintext, "\n", "", -1) // strip the newline
        result := encrypt(plaintext, plaintextKey)
        fmt.Printf("\n\n%s\n\n", result)
    }
}
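Since the strings are to be decrypted at runtime, here is a sketch of the matching decrypt step for the encrypt function above (CFB mode with the IV prefixed to the ciphertext). encrypt is restated returning an error so the sketch is self-contained; the key and message are made up for the demo. Note that CFB provides no authentication, so an AEAD mode such as AES-GCM is generally preferred for new code.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"encoding/base64"
	"errors"
	"fmt"
	"io"
)

// encrypt mirrors the question's function: a random IV is prefixed to the
// CFB-encrypted payload, and the whole thing is base64-encoded.
func encrypt(text, keyText string) (string, error) {
	block, err := aes.NewCipher([]byte(keyText))
	if err != nil {
		return "", err
	}
	plaintext := []byte(text)
	ciphertext := make([]byte, aes.BlockSize+len(plaintext))
	iv := ciphertext[:aes.BlockSize]
	if _, err := io.ReadFull(rand.Reader, iv); err != nil {
		return "", err
	}
	stream := cipher.NewCFBEncrypter(block, iv)
	stream.XORKeyStream(ciphertext[aes.BlockSize:], plaintext)
	return base64.URLEncoding.EncodeToString(ciphertext), nil
}

// decrypt reverses encrypt: decode base64, split off the IV, and XOR the
// key stream back over the payload.
func decrypt(encoded, keyText string) (string, error) {
	data, err := base64.URLEncoding.DecodeString(encoded)
	if err != nil {
		return "", err
	}
	if len(data) < aes.BlockSize {
		return "", errors.New("ciphertext too short")
	}
	block, err := aes.NewCipher([]byte(keyText))
	if err != nil {
		return "", err
	}
	iv, payload := data[:aes.BlockSize], data[aes.BlockSize:]
	stream := cipher.NewCFBDecrypter(block, iv)
	stream.XORKeyStream(payload, payload)
	return string(payload), nil
}

func main() {
	key := "0123456789abcdef0123456789abcdef" // 32 bytes -> AES-256 (demo key)
	enc, err := encrypt("a fairly long secret string", key)
	if err != nil {
		panic(err)
	}
	dec, err := decrypt(enc, key)
	if err != nil {
		panic(err)
	}
	fmt.Println(dec == "a fairly long secret string") // true
}
```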

How to remove short lines of text from a file

I am trying to write a Go program that reads a .txt file and removes all lines shorter than a specified length.
I am trying to do it while reading the file; it does find all the lines that are 2 characters long, but it doesn't remove them.
scanner := bufio.NewScanner(file)
var bs []byte
buf := bytes.NewBuffer(bs)
var text string
for scanner.Scan() {
    text = scanner.Text()
    length := len(text)
    if length < 3 {
        _, err := buf.WriteString("\n")
        if err != nil {
            exit("Couldn't replace line")
        }
    }
}
Your program is missing two things. First, buf should collect the lines you want to keep: instead of writing a bare "\n" for each short line, buffer text + "\n" for every line that is long enough:

if length >= 3 {
    _, err := buf.WriteString(text + "\n")
}
The second thing is the removal step. The simplest approach is to truncate the file and seek back to its beginning, then dump the contents of buf into the file handle:

// Reset the file size:
f.Truncate(0)
// Position at the beginning of the file:
f.Seek(0, 0)
// Finally, write the contents of buf into the file:
buf.WriteTo(f)
The full program would look like:

package main

import (
    "bufio"
    "bytes"
    "os"
)

func main() {
    f, err := os.OpenFile("input.txt", os.O_RDWR, 0644)
    if err != nil {
        panic(err)
    }
    defer f.Close()

    var buf bytes.Buffer
    scanner := bufio.NewScanner(f)
    for scanner.Scan() {
        text := scanner.Text()
        if len(text) >= 3 { // keep only the lines that are long enough
            if _, err := buf.WriteString(text + "\n"); err != nil {
                panic(err)
            }
        }
    }
    if err := scanner.Err(); err != nil {
        panic(err)
    }

    // Truncate, rewind, and write the filtered contents back.
    f.Truncate(0)
    f.Seek(0, 0)
    buf.WriteTo(f)
}
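As an alternative sketch (my own variant, not from the answer above), the filtering can be factored into a pure function over the file's contents, read with os.ReadFile and written back with os.WriteFile; this keeps the file handling separate and makes the filter itself easy to test:

```go
package main

import (
	"fmt"
	"strings"
)

// keepLongLines keeps only the lines that are at least minLen bytes long.
// Read the file with os.ReadFile, pass the contents through this function,
// and write the result back with os.WriteFile.
func keepLongLines(contents string, minLen int) string {
	var kept []string
	for _, line := range strings.Split(contents, "\n") {
		if len(line) >= minLen {
			kept = append(kept, line)
		}
	}
	return strings.Join(kept, "\n")
}

func main() {
	in := "hello\nhi\nworld\nno"
	fmt.Println(keepLongLines(in, 3)) // prints "hello" and "world"
}
```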

base64 decoder (io.Reader implementation) misbehaviour

Within a for loop I re-declare/assign a base64 decoder, and at the end of each iteration I Seek back to the beginning of the file, so that the called function (in this test case PrintBytes) can process the file from beginning to end on every pass.
Here is my (I'm sure terribly un-idiomatic) code. During the second iteration of the main for loop it fails to read the 2nd byte into the []byte of length 2 and capacity 2:
package main

import (
    "encoding/base64"
    "io"
    "log"
    "net/http"
    "os"
)

var (
    remote_file string = "http://cryptopals.com/static/challenge-data/6.txt"
    local_file  string = "secrets_01_06.txt"
)

func main() {
    f, err := os.Open(local_file)
    if err != nil {
        DownloadFile(local_file, remote_file)
        f, err = os.Open(local_file)
        if err != nil {
            log.Fatal(err)
        }
    }
    defer f.Close()
    for blocksize := 1; blocksize <= 5; blocksize++ {
        decoder := base64.NewDecoder(base64.StdEncoding, f)
        PrintBytes(decoder, blocksize)
        _, err := f.Seek(0, 0)
        if err != nil {
            log.Fatal(err)
        }
    }
}

func PrintBytes(reader io.Reader, blocksize int) {
    block := make([]byte, blocksize)
    for {
        n, err := reader.Read(block)
        if err != nil && err != io.EOF {
            log.Fatal(err)
        }
        if n != blocksize {
            log.Printf("n=%d\tblocksize=%d\tbreaking...", n, blocksize)
            break
        }
        log.Printf("%x\tblocksize=%d", block, blocksize)
    }
}

func DownloadFile(local string, url string) {
    f, err := os.Create(local)
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()
    resp, err := http.Get(url)
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()
    _, err = io.Copy(f, resp.Body)
    if err != nil {
        log.Fatal(err)
    }
}
The output from this code can be viewed here https://gist.github.com/tomatopeel/b8e2f04179c7613e2a8c8973a72ec085
It is this behaviour that I don't understand:
https://gist.github.com/tomatopeel/b8e2f04179c7613e2a8c8973a72ec085#file-bad_reader_log-L5758
I was expecting it to simply read the file 2 bytes at a time into the 2-byte slice, from beginning to end. For what reason does it only read 1 byte here?
It is not a problem with encoding/base64. With an io.Reader there is no guarantee that the number of bytes read equals the buffer size (i.e. blocksize in your example code). The documentation states:
Read reads up to len(p) bytes into p. It returns the number of bytes read (0 <= n <= len(p)) and any error encountered. Even if Read returns n < len(p), it may use all of p as scratch space during the call. If some data is available but not len(p) bytes, Read conventionally returns what is available instead of waiting for more.
In your example, change PrintBytes to:

func PrintBytes(reader io.Reader, blocksize int) {
    block := make([]byte, blocksize)
    for {
        n, err := reader.Read(block)
        // Process the data if n > 0, even when err != nil.
        if n > 0 {
            log.Printf("%x\tblocksize=%d", block[:n], blocksize)
        }
        if err == io.EOF {
            break
        }
        if err != nil {
            log.Fatal(err)
        }
        if n == 0 {
            // A Read that returns 0, nil is treated as a no-op.
            log.Printf("WARNING: read returned 0, nil")
        }
    }
}
Update: for correct usage of io.Reader, the code was changed to always process the data when n > 0, even if an error occurred on the same call.

Golang Save File from URL to GCS Bucket in AppEngine

I'm trying to fetch an image file from a URL and save it to a GCS bucket, using the documentation here.
I can't figure out how to create the file from response.Body in my App Engine environment. I can't use os.Create and reference a file path in this environment, right?
Looking for tips on getting response.Body into the wc writer while deployed on App Engine.
Code to get file:
func main() {
    url := "http://i.imgur.com/m1UIjW1.jpg"
    // don't worry about errors
    response, e := http.Get(url)
    if e != nil {
        log.Fatal(e)
    }
    defer response.Body.Close()

    // open a file for writing
    file, err := os.Create("/tmp/asdf.jpg")
    if err != nil {
        log.Fatal(err)
    }
    // Use io.Copy to dump the response body to the file; this supports huge files.
    _, err = io.Copy(file, response.Body)
    if err != nil {
        log.Fatal(err)
    }
    file.Close()
    fmt.Println("Success!")
}
Save a local file to GCS
func (d *demo) createFile(fileName string) {
    fmt.Fprintf(d.w, "Creating file /%v/%v\n", d.bucketName, fileName)

    wc := d.bucket.Object(fileName).NewWriter(d.ctx)
    wc.ContentType = "text/plain"
    wc.Metadata = map[string]string{
        "x-goog-meta-foo": "foo",
        "x-goog-meta-bar": "bar",
    }
    d.cleanUp = append(d.cleanUp, fileName)

    if _, err := wc.Write([]byte("abcde\n")); err != nil {
        d.errorf("createFile: unable to write data to bucket %q, file %q: %v", d.bucketName, fileName, err)
        return
    }
    if _, err := wc.Write([]byte(strings.Repeat("f", 1024*4) + "\n")); err != nil {
        d.errorf("createFile: unable to write data to bucket %q, file %q: %v", d.bucketName, fileName, err)
        return
    }
    if err := wc.Close(); err != nil {
        d.errorf("createFile: unable to close bucket %q, file %q: %v", d.bucketName, fileName, err)
        return
    }
}
You don't need a local file or an intermediate buffer at all. The object writer returned by NewWriter is an io.Writer, so you can copy the response body straight into it:

wc := d.bucket.Object(fileName).NewWriter(d.ctx)
wc.ContentType = "image/jpeg"
if _, err := io.Copy(wc, response.Body); err != nil {
    // handle the error
}
if err := wc.Close(); err != nil {
    // handle the error
}
