writing extra data into io.Writer

By : user2956389
Date : November 22 2020, 10:56 AM
If you know upfront the bytes you need to insert into the stream (whatever the stream carries, image data or not), building a Writer for this specific purpose is not really hard.
Here is an example:
code :
package main

import (
    "fmt"
    "image"
    "image/color"
    "image/draw"
    "image/jpeg"
    "io"
    "os"
)

// ByteInserter is an io.Writer that forwards everything to w,
// inserting the bytes b once the stream reaches offset pos.
type ByteInserter struct {
    n   int       // bytes of the original stream written so far
    b   []byte    // bytes to insert
    pos int       // offset at which to insert them
    w   io.Writer // underlying writer
}

// NewByteInserter wraps w so that extra bytes can be spliced into the stream.
func NewByteInserter(w io.Writer) *ByteInserter {
    return &ByteInserter{w: w}
}

// Set registers the bytes b to insert at byte offset pos of the stream.
func (bi *ByteInserter) Set(b []byte, pos int) {
    bi.b = b
    bi.pos = pos
}

// Write forwards p to the underlying writer, splicing in bi.b when the
// stream position crosses bi.pos. As required by the io.Writer contract,
// the returned count n only covers bytes taken from p.
func (bi *ByteInserter) Write(p []byte) (n int, err error) {
    if bi.n > bi.pos || bi.n+len(p) <= bi.pos {
        // The insertion point is not inside this chunk: pass it through.
        n, err = bi.w.Write(p)
        bi.n += n
    } else {
        // Write the part of p before the insertion point...
        cut := bi.pos - bi.n
        if cut > 0 {
            n, err = bi.w.Write(p[:cut])
            bi.n += n
            if err != nil {
                return
            }
        }
        // ...then the extra bytes...
        _, err = bi.w.Write(bi.b)
        if err != nil {
            return
        }
        // ...then the rest of p.
        n2 := 0
        n2, err = bi.w.Write(p[cut:])
        bi.n += n2
        n += n2
    }
    return
}

func main() {

    // Blue rectangle, stolen from Nigel Tao's post
    // http://blog.golang.org/go-imagedraw-package
    img := image.NewRGBA(image.Rect(0, 0, 640, 480))
    blue := color.RGBA{0, 0, 255, 255}
    draw.Draw(img, img.Bounds(), &image.Uniform{blue}, image.ZP, draw.Src)

    file, err := os.Create("file.jpg")
    if err != nil {
        panic(err)
    }

    bi := NewByteInserter(file)
    bi.Set([]byte("XXX"), 2) // three bytes inserted at offset 2
    if err := jpeg.Encode(bi, img, nil); err != nil {
        panic(err)
    }
    file.Close()

    fmt.Println("Written!")
}


Binary writer inserts extra character in writing

By : sampolly1992
Date : March 29 2020, 07:55 AM
The extra characters are the encoded length of the string, because BinaryWriter is not meant to produce a text file: BinaryWriter writes... binary. It writes the string in a length-prefixed binary format that it can read back with BinaryReader.ReadString.
If all you want is a plain text file, use just:
code :
File.WriteAllText(path, strCXML2);
Extra data when writing to the console after pulling data from file

By : Jeff Umscheid
Date : March 29 2020, 07:55 AM
You are still writing out 94 items; it's just that the 94th item contains a newline, so 95 lines end up being written. To pull the values back out of the file, you can read the lines and split each one on commas:
code :
var values = File.ReadLines(path).SelectMany(line=>line.Split(','));
Does Spark Structured Streaming Kafka Writer supports writing data to particular partition?

By : Dan
Date : March 29 2020, 07:55 AM
The keys determine which partition to write to; no, you can't hard-code a partition value within Spark's write methods.
Spark does let you configure kafka.partitioner.class, though, which allows you to derive the partition number from the keys of the data.
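For illustration only, here is a minimal Structured Streaming sketch of that idea. It assumes the spark-sql-kafka-0-10 connector is on the classpath; the rate source, broker address, topic name, and com.example.MyPartitioner class are placeholders, not anything from the question. The key column is what Kafka's default partitioner hashes, and the kafka.partitioner.class option (forwarded to the producer because of the kafka. prefix) swaps in a custom one.
code :
import org.apache.spark.sql.SparkSession

object KafkaKeyedWriter {
    def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
            .appName("kafka-keyed-writer")
            .getOrCreate()

        // Placeholder source: a rate stream standing in for the real data.
        val df = spark.readStream.format("rate").option("rowsPerSecond", "10").load()

        val query = df
            // The key column is what the partitioner uses to pick a partition.
            .selectExpr("CAST(value AS STRING) AS key", "CAST(timestamp AS STRING) AS value")
            .writeStream
            .format("kafka")
            .option("kafka.bootstrap.servers", "broker:9092")               // placeholder broker
            .option("topic", "events")                                      // placeholder topic
            // Options prefixed with "kafka." are passed to the Kafka producer;
            // a custom Partitioner class can map each key to a partition number.
            .option("kafka.partitioner.class", "com.example.MyPartitioner") // hypothetical class
            .option("checkpointLocation", "/tmp/checkpoints/events")
            .start()

        query.awaitTermination()
    }
}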
FileChannel and ByteBuffer writing extra data

By : evhernandez23
Date : March 29 2020, 07:55 AM
To take in a file, split it into shardCount pieces, and generate a parity file, here is the solution I found (though it may need more tuning):
code :
public static void splitAndGenerateParityFile(File file, int shardCount, String fileID) throws IOException {
    int BUFFER_SIZE = 4 * 1024 * 1024;
    RandomAccessFile rin = new RandomAccessFile(file, "r");
    FileChannel fcin = rin.getChannel();

    //Create parity files
    File parity = new File(fileID + "_parity");
    if (parity.exists()) throw new FileAlreadyExistsException("Could not create parity file! File already exists!");
    RandomAccessFile parityRAF = new RandomAccessFile(parity, "rw");
    FileChannel parityOut = parityRAF.getChannel();

    //Create shard files
    ArrayList<File> shards = new ArrayList<>(shardCount);
    for (int i = 0; i < shardCount; i++) {
        File f = new File(fileID + "_part_" + i);
        if (f.exists()) throw new FileAlreadyExistsException("Could not create shard file! File already exists!");
        shards.add(f);
    }

    long bytesPerFile = (long) Math.ceil((double) rin.length() / shardCount); // cast to double so the ceiling actually rounds up

    ArrayList<ByteBuffer> shardBuffers = new ArrayList<>(shardCount);

    //Make buffers for each section of the file we will be reading from
    for (int i = 0; i < shardCount; i++) {
        ByteBuffer bb = ByteBuffer.allocate(BUFFER_SIZE);
        shardBuffers.add(bb);
    }

    ByteBuffer parityBuffer = ByteBuffer.allocate(BUFFER_SIZE);

    //Generate parity
    boolean isParityBufferEmpty = true;
    for (long i = 0; i < bytesPerFile; i++) {
        isParityBufferEmpty = false;
        int pos = (int) (i % BUFFER_SIZE);
        byte p = 0;

        if (pos == 0) {
            //Read chunk of file into each buffer
            for (int j = 0; j < shardCount; j++) {
                ByteBuffer bb = shardBuffers.get(j);
                bb.clear();
                fcin.position(bytesPerFile * j + i);
                fcin.read(bb);
                bb.flip();
            }

            //Dump parity buffer
            if (i > 0) {
                parityBuffer.flip();
                while (parityBuffer.hasRemaining()) {
                    parityOut.write(parityBuffer);
                }
                parityBuffer.clear();
                isParityBufferEmpty = true;
            }
        }

        //Get parity
        for (ByteBuffer bb : shardBuffers) {
            if (!bb.hasRemaining()) break;
            p ^= bb.get();
        }

        //Put parity in buffer
        parityBuffer.put(p);
    }

    if (!isParityBufferEmpty) {
        parityBuffer.flip();
        parityOut.write(parityBuffer);
        parityBuffer.clear();
    }

    fcin.close();
    rin.close();
    parityOut.close();
    parityRAF.close();
}
What is the benefit of using a TextReader/Writer instead of a Binary Reader/Writer for string data?

By : Thomas Hammerl
Date : March 29 2020, 07:55 AM
No, there are a bucket load of other differences, e.g. ReadLine/WriteLine and encoding handling... basically loads of 'helper' functions that pertain to text.