
Batch-convert files for encoding

How can I batch-convert the files in a directory from one encoding to another (e.g. ANSI->UTF-8) with a command or tool?

For a single file an editor does the job, but how can I convert many files at once?

UTFCast is a Unicode converter for Windows that supports batch mode. I'm using the paid version and am quite happy with it.

UTFCast is a Unicode converter that lets you batch convert all text files to UTF encodings with just a click of your mouse. You can use it to convert a directory full of text files to UTF encodings including UTF-8, UTF-16 and UTF-32 to an output directory, while maintaining the directory structure of the original files. It doesn't even matter if your text file has a different extension, UTFCast can automatically detect text files and convert them.

With PowerShell, the basic conversion is:

%  get-content IN.txt | out-file -encoding ENC -filepath OUT.txt

where ENC is something like unicode, ascii, utf8, or utf32. Check out 'help out-file' for the full list.
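If you'd rather not use PowerShell, a few lines of Python do the same single-file conversion. This is a minimal sketch: the function name is mine, and cp1252 is an assumed stand-in for "ANSI" (adjust it to your actual source codepage).

```python
# Re-encode one text file from a legacy codepage to UTF-8.
# cp1252 is an assumption for "ANSI"; change it to match your source files.
from pathlib import Path

def convert_file(src: str, dst: str,
                 src_enc: str = "cp1252", dst_enc: str = "utf-8") -> None:
    # Decode with the source encoding, then write back out as UTF-8.
    text = Path(src).read_text(encoding=src_enc)
    Path(dst).write_text(text, encoding=dst_enc)
```

Decoding and re-encoding the whole file at once is fine for ordinary text files; it will raise a UnicodeDecodeError rather than silently corrupt data if the source encoding guess is wrong.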

To convert all the *.txt files in a directory to UTF-8, do something like this:

% foreach($i in ls -name DIR/*.txt) {
      get-content DIR/$i |
      out-file -encoding utf8 -filepath DIR2/$i
  }

which creates a converted version of each .txt file in DIR2.
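The same directory-to-directory conversion can be sketched in Python. The function name and the cp1252 source encoding are assumptions, as are the DIR/DIR2 roles of the two arguments.

```python
# Convert every *.txt in src_dir to a UTF-8 copy of the same name in dst_dir.
# cp1252 stands in for "ANSI" here; adjust to your real source encoding.
from pathlib import Path

def convert_dir(src_dir: str, dst_dir: str, src_enc: str = "cp1252") -> int:
    out = Path(dst_dir)
    out.mkdir(parents=True, exist_ok=True)
    count = 0
    for path in Path(src_dir).glob("*.txt"):
        text = path.read_text(encoding=src_enc)
        (out / path.name).write_text(text, encoding="utf-8")
        count += 1
    return count  # number of files converted
```

Writing to a separate output directory, as the PowerShell loop above does, keeps the originals intact in case a file turns out to be in a different encoding.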

EDIT: To convert the files in all subdirectories in place, use:

% foreach($i in ls -recurse -filter "*.java") {
      $temp = get-content $i.fullname
      out-file -filepath $i.fullname -inputobject $temp -encoding utf8 -force
  }
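A recursive, in-place variant is just as short in Python. Again a sketch: the function name is mine, and cp1252 is an assumed source encoding. Note that this overwrites files in place, so test it on a copy first.

```python
# Recursively re-encode files matching a pattern in place, like the
# -recurse PowerShell loop. cp1252 is an assumed "ANSI" source encoding.
from pathlib import Path

def convert_tree(root: str, pattern: str = "*.java",
                 src_enc: str = "cp1252") -> int:
    n = 0
    for path in Path(root).rglob(pattern):
        text = path.read_text(encoding=src_enc)
        path.write_text(text, encoding="utf-8")  # overwrite in place
        n += 1
    return n  # number of files rewritten
```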