combined-stream

A stream that emits multiple other streams one after another.

NB: Currently combined-stream works with streams version 1 only. There is an ongoing effort to switch this library to streams version 2. Any help is welcome. :) Meanwhile you can explore other libraries that provide streams2 support with varying degrees of compatibility with combined-stream.

  • combined-stream2: A drop-in streams2-compatible replacement for the combined-stream module.

  • multistream: A stream that emits multiple other streams one after another.

Installation

npm install combined-stream

Usage

Here is a simple example that shows how you can use combined-stream to combine two files into one:

var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.pipe(fs.createWriteStream('combined.txt'));

While the example above works great, it will pause all source streams until they are needed. If you don't want that to happen, you can set pauseStreams to false:

var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create({pauseStreams: false});
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.pipe(fs.createWriteStream('combined.txt'));

However, what if you don't have all the source streams yet, or you don't want to allocate the resources (file descriptors, memory, etc.) for them right away? Well, in that case you can simply provide a callback that supplies the stream by calling a next() function:

var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(function(next) {
  next(fs.createReadStream('file1.txt'));
});
combinedStream.append(function(next) {
  next(fs.createReadStream('file2.txt'));
});

combinedStream.pipe(fs.createWriteStream('combined.txt'));

API

CombinedStream.create([options])

Returns a new combined stream object. Available options are:

  • maxDataSize
  • pauseStreams

The effect of those options is described below.
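
For reference, here is a minimal sketch of passing both options to create(); the values are illustrative, and the defaults are given in the sections below:

var CombinedStream = require('combined-stream');

// Illustrative values only; the defaults are pauseStreams: true and
// maxDataSize: 2 * 1024 * 1024.
var combinedStream = CombinedStream.create({
  pauseStreams: true,
  maxDataSize: 10 * 1024 * 1024
});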

combinedStream.pauseStreams = true

Whether to apply back pressure to the underlying streams. If set to false, the underlying streams will never be paused. If set to true, the underlying streams will be paused right after being appended, as well as when delayedStream.pipe() wants to throttle.

combinedStream.maxDataSize = 2 * 1024 * 1024

The maximum number of bytes (or characters) to buffer for all source streams. If this value is exceeded, combinedStream emits an 'error' event.
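
A rough sketch of capping the buffer and handling the resulting 'error' event; the 1 MiB limit and the file names are arbitrary placeholders:

var CombinedStream = require('combined-stream');
var fs = require('fs');

// Arbitrary 1 MiB cap for illustration; the default is 2 * 1024 * 1024.
var combinedStream = CombinedStream.create({maxDataSize: 1024 * 1024});
combinedStream.append(fs.createReadStream('file1.txt'));

// Fired if the amount of buffered data exceeds maxDataSize.
combinedStream.on('error', function(err) {
  console.error(err.message);
});

combinedStream.pipe(fs.createWriteStream('combined.txt'));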

combinedStream.dataSize = 0

The number of bytes (or characters) currently buffered by combinedStream.

combinedStream.append(stream)

Appends the given stream to the combinedStream object. If pauseStreams is set to true, this stream will also be paused right away.

stream can also be a function that takes one parameter called next. next is a function that must be invoked in order to provide the next stream; see the example above.

Regardless of how the stream is appended, combined-stream always attaches an 'error' listener to it, so you don't have to do that manually.

Special case: stream can also be a String or Buffer.
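
For example, a short sketch that interleaves a plain String (or Buffer) separator between two file streams; the file names are placeholders:

var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.append('\n---\n');               // a plain String
combinedStream.append(Buffer.from('\n---\n'));  // or a Buffer
combinedStream.append(fs.createReadStream('file2.txt'));

combinedStream.pipe(fs.createWriteStream('combined.txt'));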

combinedStream.write(data)

You should not call this; combinedStream takes care of piping the appended streams into itself for you.

combinedStream.resume()

Causes combinedStream to start draining the streams it manages. The function is idempotent, and it also emits a 'resume' event each time, which usually goes to the stream that is currently being drained.

combinedStream.pause();

If combinedStream.pauseStreams is set to false, this does nothing. Otherwise a 'pause' event is emitted; this goes to the stream that is currently being drained, so you can use it to apply back pressure.
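
A rough sketch of pausing and resuming manually, assuming pauseStreams was left at its default of true; the file names and the one-second delay are placeholders:

var CombinedStream = require('combined-stream');
var fs = require('fs');

var combinedStream = CombinedStream.create();
combinedStream.append(fs.createReadStream('file1.txt'));
combinedStream.pipe(fs.createWriteStream('combined.txt'));

// Emits a 'pause' event that goes to the stream currently being drained.
combinedStream.pause();

// Idempotent; emits 'resume' and starts draining again.
setTimeout(function() {
  combinedStream.resume();
}, 1000);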

combinedStream.end();

Sets combinedStream.writable to false, emits an 'end' event, and removes all streams from the queue.

combinedStream.destroy();

Same as combinedStream.end(), except it emits a 'close' event instead of 'end'.

License

combined-stream is licensed under the MIT license.