Streamwriter/reader problem.

PostPosted: 19 Feb 2014, 15:23
by Leeizazombie
I'm trying to make a command that fully removes a custom command from a server, but I'm having trouble removing the line from cmdautoload.txt. My attempt is:

Code:
string line = null;
string line_to_delete = message;

using (StreamReader reader = new StreamReader(Environment.CurrentDirectory + "/text/cmdautoload.txt"))
{
    using (StreamWriter writer = new StreamWriter(Environment.CurrentDirectory + "/text/cmdautoload.txt"))
    {
        while ((line = reader.ReadLine()) != null)
        {
            if (String.Compare(line, line_to_delete) == 0)
                continue;

            writer.WriteLine(line);
        }
    }
}


Error:
Code:
-------------------------

----19/02/2014 14:15:55 ----
Type: IOException
Source: mscorlib
Message: The process cannot access the file 'C:\Users\Leebyrne115\Desktop\Zombie Server - Copy\text\cmdautoload.txt' because it is being used by another process.
Target: WinIOError
Trace:    at System.IO.__Error.WinIOError(Int32 errorCode, String maybeFullPath)
   at System.IO.FileStream.Init(String path, FileMode mode, FileAccess access, Int32 rights, Boolean useRights, FileShare share, Int32 bufferSize, FileOptions options, SECURITY_ATTRIBUTES secAttrs, String msgPath, Boolean bFromProxy)
   at System.IO.FileStream..ctor(String path, FileMode mode, FileAccess access, FileShare share, Int32 bufferSize, FileOptions options)
   at System.IO.StreamWriter.CreateFile(String path, Boolean append)
   at System.IO.StreamWriter..ctor(String path, Boolean append, Encoding encoding, Int32 bufferSize)
   at System.IO.StreamWriter..ctor(String path)
   at MCDzienny.CmdCmdRemove.Use(Player p, String message)
   at MCDzienny.Player.<>c__DisplayClass28.<HandleCommand>b__23()

-------------------------


I don't understand how the file could already be in use. I had it closed, and isn't it only used at server start-up? Any solutions?

Re: Streamwriter/reader problem.

PostPosted: 19 Feb 2014, 18:19
by joppiesaus
You can't simply read from and write to the same file at the same time: the StreamReader still holds the file open when the StreamWriter tries to create it, which is why you get the IOException. ;)

Here's a simple solution. I don't know if it's the most efficient one.
Code:
string line_to_delete = message;
string[] lines;
int a = 0;

using (StreamReader reader = new StreamReader(Environment.CurrentDirectory + "/text/cmdautoload.txt"))
{
    // Read the whole file once and split it into lines.
    // (Calling ReadToEnd and then ReadLine on the same reader wouldn't work,
    // because ReadToEnd consumes the entire stream.)
    string[] allLines = reader.ReadToEnd().Split(
        new string[] { System.Environment.NewLine }, StringSplitOptions.None);
    lines = new string[allLines.Length];

    foreach (string line in allLines)
    {
        if (String.Compare(line, line_to_delete) == 0)
            continue;

        lines[a] = line;
        a++;
    }
}

// Resize the array, so you won't get NULL REFERENCE DERP errors
Array.Resize(ref lines, a);

// The reader has been disposed by now, so the file is free to open for writing.
using (StreamWriter writer = new StreamWriter(Environment.CurrentDirectory + "/text/cmdautoload.txt"))
{
    // Write all the kept lines back out
    for (int i = 0; i < lines.Length; i++)
    {
        writer.WriteLine(lines[i]);
    }
}

Re: Streamwriter/reader problem.

PostPosted: 19 Feb 2014, 18:36
by Leeizazombie
Ah, I see. Thanks for the help <ok>

Re: Streamwriter/reader problem.

PostPosted: 26 Feb 2014, 10:16
by dzienny
You should use System.IO.File.ReadAllLines(string path):
Code:
var lines = File.ReadAllLines(filePath);


The StreamReader and StreamWriter classes are useful when you deal with large files. If a file is 1 GB, you probably don't want to load it into RAM, and some files may even exceed your RAM. Streams only load a portion of a file at a time, so they let you process extremely large files. But if a file is relatively small, it's fine to load it all into RAM.
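Putting that together, the whole remove-a-line operation could look like the sketch below. The path and the use of `message` follow the earlier snippets; the `RemoveLineExample` wrapper and the sample value are just stand-ins for illustration. `File.ReadAllLines` opens, reads, and closes the file in one call, so nothing is left holding it when `File.WriteAllLines` opens it again:

```csharp
using System;
using System.IO;
using System.Linq;

class RemoveLineExample
{
    static void Main()
    {
        string path = Environment.CurrentDirectory + "/text/cmdautoload.txt";
        string line_to_delete = "myCommand"; // stand-in for the original 'message'

        // Read every line into memory; the file handle is released here.
        string[] kept = File.ReadAllLines(path)
                            .Where(l => l != line_to_delete)
                            .ToArray();

        // Safe to reopen the file now and write the filtered lines back.
        File.WriteAllLines(path, kept);
    }
}
```

The LINQ `Where` filter replaces the manual array-resizing from the earlier answer; for small config files like cmdautoload.txt this is the simplest correct approach.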