
Populate document metadata using Data APIs

What a cluster.

This was a lot of work to work around existing, bad APIs; there is no
time to refactor at the moment; this already took much longer than
expected.
master
Mike Gerwitz 2017-06-28 16:12:08 -04:00
parent 65ab92f701
commit 0c24e3d280
14 changed files with 918 additions and 63 deletions

View File

@@ -203,6 +203,10 @@ Program@tie{}XML
Data@tie{}API
@end macro
@macro dapiref
@dapi (@pxref{Data API,,Data@tie{}API})
@end macro
@c todo: link to reference directly
@macro proguicref{ref}
`\ref\' @proguicrefsuffix

View File

@@ -40,8 +40,11 @@ Programs are ideally compiled from a @ref{Program XML,,Program@tie{}XML}
@menu
* Program UI::
* Program XML::
* Document Metadata:: Document-level data that cannot be modified by
the client.
@end menu
@node Program UI
@section Program UI
@maintenance{
@@ -220,3 +223,60 @@ Within the context of the @progxml,
It reads as a sentence:
``@samp{vacant_desc}'' is applicable when we should @tie{}``describe
a vacant property''.
@node Document Metadata
@section Document Metadata
@dfn{Document metadata} are data that describe the document itself;
they are stored adjacent to the bucket in @samp{meta}@tie{}on the
document root.@footnote{
Terminology note: ``document'' and ``quote'' are the same thing;
the latter is transitioning to the former for generality.}
They should be used in place of a bucket field any time
the client has no business knowing about the data.
The @samp{meta} record is called the @dfn{Metabucket}.
@c don't use a dapi xref here; don't want to confuse the reader by
@c directing them away from this section before they continue reading
@tip{Metadata in the Metabucket should@tie{}@emph{not} be
directly populated by external systems@mdash{
}@dapi integration should be used instead (see below).}
Metadata can be populated using any@tie{}@dapiref@mdash{
}return data populate the Metabucket in the same way that they
populate the Bucket.
Definitions are stored in @code{meta.fields},
as shown in @ref{f:meta-fields}.
@float Figure, f:meta-fields
@example
"fields":@{
["string(name)": @{
"desc": "string",
"dapi": @{
"name": "string",
"map": @{
"string(dest field)": "string(source field)"
@}
@}
@}
@}
@end example
@caption{Format of @code{meta.fields}.}
@end float
Further, a key-value mapping of all bucket fields that@mdash{
}when modified,
need to result in a metadata API@tie{}call@mdash{
}is stored in the @code{mapis}@tie{}object;
this is shown in @ref{f:mapis}.
@float Figure, f:mapis
@example
"mapis":@{
["string(field name)"]: [ "string(dapi name)", ... ]
@}
@end example
@caption{Format of @code{mapis}.}
@end float
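To make the two formats above concrete, here is a hypothetical sketch; every name in it (@samp{owner_info}, @samp{dapi_owner}, @samp{name_field}) is invented for illustration and does not appear in any actual program:

```javascript
// Hypothetical sketch of the meta.fields and mapis formats described
// above; all field and API names here are invented.
const meta_fields = {
    // one definition per metadata field
    owner_info: {
        desc: "Owner details returned by an external service",
        dapi: {
            name: "dapi_owner",
            map:  { owner_name: "name_field" },  // dest field <- source field
        },
    },
};

// when `name_field' changes in the bucket, re-trigger the Data APIs
// populating each of the listed metadata fields
const mapis = {
    name_field: [ "owner_info" ],
};
```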

View File

@@ -56,9 +56,10 @@ The HTTP server is managed by
@menu
* Requests:: Handling HTTP requests
* Posting Data:: Handling step saves and other posts.
* Encryption Service:: Managing sensitive data.
* Requests:: Handling HTTP requests.
* Posting Data:: Handling step saves and other posts.
* Server-Side Data API Calls:: Accessing external resources on the server.
* Encryption Service:: Managing sensitive data.
@end menu
@@ -202,6 +203,11 @@ Once those basic checks have passed,
re-calculated on the server (the values posted by the client have
already been discarded by the first step in this list);
@item
Server-side @dapi{} calls (@pxref{Data API}) are triggered using the
diff as input data and an empty bucket for response storage
(@pxref{Server-Side Data API Calls});
@item
@cindex Premium calculation date
The last premium calculation date is cleared (indicating that
@@ -226,6 +232,37 @@ Once those basic checks have passed,
@node Server-Side Data API Calls
@section Server-Side Data API Calls
@maintenance{This makes use of @srcrefjs{server/meta,DapiMetaSource}
to encapsulate the horrible API of @srcrefjs{dapi,DataApiManager};
the latter needs cleanup to remove the former.}
@cindex Data API
@cindex Document metadata
Server-side @dapi{} calls (@pxref{Data API}) are triggered on
step save (@pxref{Posting Data}) and are handled much like they are
on the client.
Such calls are made automatically only for document metadata.
Results of server-side calls are @emph{not} written to the bucket
and are therefore useful for data that the client should not be
permitted to modify;
it also allows data to be kept secret from the client.@footnote{
All bucket data is served to the client,
with the exception of internal fields if the user is non-internal.}
@dapi{} results on the client can be mapped back to multiple bucket values;
the server, however, is far more conservative about how data are
propagated, for data-integrity and security reasons.
Further,
document metadata can be structured,
unlike the Bucket which has a rigid matrix format (@pxref{Bucket}).
Therefore,
the entire response is mapped into the parent field;
defined return values are used only for filtering.
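As an illustrative sketch (not the actual server implementation), mapping the entire response into the parent metadata field looks something like this; the field name and response shape are invented:

```javascript
// Illustrative sketch: the whole dapi response object is stored as
// the value of the metadata field at the triggering index, preserving
// its structure; defined return values are used only for filtering,
// never for splitting the response across bucket fields.
function mapResponseToMeta( meta, field, index, response )
{
    meta[ field ]          = meta[ field ] || [];
    meta[ field ][ index ] = response;
    return meta;
}

const meta = mapResponseToMeta( {}, "owner_info", 0, {
    name: "Foo Corp",
    id:   123,
} );
```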
@node Encryption Service
@section Encryption Service
@helpwanted

View File

@@ -22,7 +22,6 @@
const Class = require( 'easejs' ).Class;
const HttpDataApi = require( './http/HttpDataApi' );
const XhrHttpImpl = require( './http/XhrHttpImpl' );
const NodeHttpImpl = require( './http/NodeHttpImpl' );
const JsonResponse = require( './format/JsonResponse' );
const RestrictedDataApi = require( './RestrictedDataApi' );
const StaticAdditionDataApi = require( './StaticAdditionDataApi' );
@@ -30,7 +29,7 @@ const BucketDataApi = require( './BucketDataApi' );
/**
* Instantiates the appropriate DataApi object for the givne service type
* Instantiates the appropriate DataApi object for the given service type
*/
module.exports = Class( 'DataApiFactory',
{
@@ -58,15 +57,7 @@ module.exports = Class( 'DataApiFactory',
switch ( type )
{
case 'rest':
const impl = ( typeof XMLHttpRequest !== 'undefined' )
? XhrHttpImpl( XMLHttpRequest )
: NodeHttpImpl(
{
http: require( 'http' ),
https: require( 'https' ),
},
require( 'url' )
);
const impl = this.createHttpImpl();
api = HttpDataApi.use( JsonResponse )(
source,
@@ -93,6 +84,12 @@ module.exports = Class( 'DataApiFactory',
StaticAdditionDataApi( api, nonempty, multiple, static_data ),
desc
);
}
},
'virtual protected createHttpImpl'()
{
return XhrHttpImpl( XMLHttpRequest );
},
} );

View File

@@ -24,19 +24,24 @@ const { Class } = require( 'easejs' );
const {
bucket: {
bucket_filter,
QuoteDataBucket,
},
dapi: {
DataApiFactory,
DataApiManager,
},
server: {
Server,
meta: {
DapiMetaSource,
},
request: {
DataProcessor,
JsonServerResponse,
ServerDataApiFactory,
},
},
} = require( '../..' );
@@ -51,6 +56,18 @@ module.exports = Class( 'DocumentServer',
new JsonServerResponse.create(),
dao,
logger,
enc_service
enc_service,
DataProcessor(
bucket_filter,
( apis, request ) => DataApiManager(
ServerDataApiFactory(
origin_url || request.getOrigin(),
request
),
apis
),
DapiMetaSource( QuoteDataBucket )
)
),
} );

View File

@@ -48,6 +48,9 @@ const {
},
server: {
request: {
DataProcessor,
},
encsvc: {
QuoteDataBucketCipher,
},
@@ -109,13 +112,27 @@ module.exports = Class( 'Server' )
*/
'private _cache': null,
/**
* Client-provided data processor
* @type {DataProcessor}
*/
'private _dataProcessor': null,
'public __construct': function( response, dao, logger, encsvc )
'public __construct': function(
response, dao, logger, encsvc, data_processor
)
{
this.response = response;
this.dao = dao;
this.logger = logger;
this._encService = encsvc;
if ( !Class.isA( DataProcessor, data_processor ) )
{
throw TypeError( "Expected DataProcessor" );
}
this.response = response;
this.dao = dao;
this.logger = logger;
this._encService = encsvc;
this._dataProcessor = data_processor;
},
@@ -1115,14 +1132,19 @@ module.exports = Class( 'Server' )
{
try
{
var filtered = server._sanitizeBucketData(
post_data.data, request, program
var parsed_data = JSON.parse( post_data.data );
var bucket = quote.getBucket();
const { filtered, dapis } = server._dataProcessor.processDiff(
parsed_data, request, program, bucket
);
quote.setData( filtered );
server._monitorMetadataPromise( quote, dapis );
// calculated values (store only)
program.initQuote( quote.getBucket(), true );
program.initQuote( bucket, true );
}
catch ( err )
{
@@ -1150,33 +1172,27 @@ module.exports = Class( 'Server' )
},
/**
* Sanitize the given bucket data
*
* Ensures that we are storing only "correct" data within our database. This
* also strips any unknown bucket values, preventing users from using us as
* their own personal database.
*/
'private _sanitizeBucketData': function(
bucket_data, request, program, permit_null
)
'private _monitorMetadataPromise'( quote, dapis )
{
var data = JSON.parse( bucket_data ),
types = program.meta.qtypes,
ignore = {};
// if we're not internal, filter out the internal questions
// (so they can't post to them)
if ( request.getSession().isInternal() === false )
{
for ( id in program.internal )
{
ignore[ id ] = true;
}
}
// return the filtered data
return bucket_filter.filter( data, types, ignore, permit_null );
dapis.map( promise => promise
.then( ( { field, index, data } ) =>
this.dao.saveQuoteMeta(
quote,
data,
null,
e => { throw e; }
)
)
.catch( e =>
server.logger.log(
server.logger.PRIORITY_ERROR,
"Failed to save field %s[%s] metadata: %s",
field,
index,
e.message
)
)
);
},
@@ -1619,8 +1635,10 @@ module.exports = Class( 'Server' )
// sanitize, permitting nulls (since the diff will have them)
try
{
var filtered = _self._sanitizeBucketData(
post_data.data, request, program, true
var data = JSON.parse( post_data.data );
var filtered = _self._dataProcessor.sanitizeDiff(
data, request, program, true
);
}
catch ( e )

View File

@@ -108,9 +108,8 @@ exports.init = function( logger, enc_service )
{ native_parser: false, safe: false }
);
var dao = MongoServerDao( db );
server = DocumentServer().create( dao, logger, enc_service );
const dao = MongoServerDao( db );
server = _createDocumentServer( dao, logger, enc_service );
server_cache = _createCache( server );
server.init( server_cache, exports.rater );
@@ -149,6 +148,25 @@ exports.init = function( logger, enc_service )
}
function _createDocumentServer( dao, logger, enc_service )
{
const origin_url = process.env.HTTP_ORIGIN_URL || '';
if ( !origin_url )
{
// this allows the system to work without configuration (e.g. for
// local development), but is really bad
logger.log( logger.PRIORITY_IMPORTANT,
"*** HTTP_ORIGIN_URL environment variable not set; " +
"system will fall back to using the origin of HTTP requests, " +
"meaning an attacker can control where server-side requests go! ***"
);
}
return DocumentServer().create( dao, logger, enc_service, origin_url );
}
function _initExportService( db, callback )
{
db.collection( 'quotes', function( err, collection )

View File

@@ -521,6 +521,38 @@ module.exports = Class( 'MongoServerDao' )
},
/**
* Save document metadata (meta field on document)
*
* Only the provided indexes will be modified (that is---data will be
* merged with what is already in the database).
*
* @param {Quote} quote destination quote
* @param {Object} new_meta bucket-formatted data to write
* @param {Function} success callback on success
* @param {Function} failure callback on error
*
* @return {undefined}
*/
'public saveQuoteMeta'( quote, new_meta, success, failure )
{
const update = {};
for ( var key in new_meta )
{
var meta = new_meta[ key ];
for ( var i in meta )
{
update[ 'meta.' + key + '.' + i ] =
new_meta[ key ][ i ];
}
}
this.mergeData( quote, update, success, failure );
},
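The loop above flattens bucket-formatted metadata into Mongo dot-notation paths so that only the provided indexes are modified on update. A standalone sketch of just that flattening (extracted for illustration, not the class itself):

```javascript
// Standalone sketch of the flattening performed by saveQuoteMeta:
// bucket-formatted metadata become dot-notation update paths, e.g.
// { foo: [ "a", "b" ] } => { "meta.foo.0": "a", "meta.foo.1": "b" }
function flattenMeta( new_meta )
{
    const update = {};

    for ( const key in new_meta )
    {
        for ( const i in new_meta[ key ] )
        {
            update[ 'meta.' + key + '.' + i ] = new_meta[ key ][ i ];
        }
    }

    return update;
}

const update = flattenMeta( { foo: [ "a", "b" ] } );
```

Because each index becomes its own path, an update touching `meta.foo.1` leaves `meta.foo.0` in the database untouched, which is what permits the merge semantics described in the docblock.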
/**
* Saves the quote lock state to the database
*

View File

@@ -71,6 +71,13 @@ module.exports = Class( 'DapiMetaSource',
data,
( err, api_data ) =>
{
if ( api_data.length > 1 )
{
reject( Error(
"Data API request produced more than one result"
) );
// do not fall through to setFieldData below
return;
}
dapi_manager.setFieldData(
dapi.name,
index,

View File

@@ -0,0 +1,260 @@
/**
* Manages DataAPI requests and return data
*
* Copyright (C) 2017 R-T Specialty, LLC.
*
* This file is part of the Liza Data Collection Framework.
*
* liza is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*/
'use strict';
const { Class } = require( 'easejs' );
const { QuoteDataBucket } = require( '../../' ).bucket;
/**
* Process data provided by the client
*
* TODO: This contains Data API and bucket merging logic that is better done
* elsewhere.
*/
module.exports = Class( 'DataProcessor',
{
/**
* Bucket filter
* @type {Object}
*/
'private _filter': null,
/**
* Construct Data API manager
* @type {function()}
*/
'private _dapif': null,
/**
* Metadata source
* @type {DapiMetaSource}
*/
'private _metaSource': null,
/**
* Initialize processor
*
* @param {Object} filter bucket filter
* @param {function()} dapif data API constructor
* @param {DapiMetaSource} meta_source metadata source
*/
constructor( filter, dapif, meta_source )
{
this._filter = filter;
this._dapif = dapif;
this._metaSource = meta_source;
},
/**
* Process client-provided data diff
*
* This performs sanitization to ensure that we are storing only
* "correct" data within our database. This also strips any unknown
* bucket values, preventing users from using us as their own personal
* database.
*
* @param {Object} data bucket diff data
* @param {UserRequest} request submitting request
* @param {Program} program active program
* @param {Bucket} bucket active bucket
*
* @return {Object} processed diff and metadata dapi promises
*/
'public processDiff'( data, request, program, bucket )
{
const filtered = this.sanitizeDiff( data, request, program, false );
const dapi_manager = this._dapif( program.apis, request );
// array of promises for any dapi requests
const dapis = this._triggerDapis(
dapi_manager, program, data, bucket
);
return {
filtered: filtered,
dapis: dapis,
};
},
/**
* Sanitize client-provided data
*
* Internal fields will be stripped if the session is not
* internal. Following that, the filter provided via the ctor will be
* applied.
*
* `permit_null` should be used only in the case of bucket diffs, which
* contain nulls as terminators.
*
* @param {Object} data client-provided data
* @param {UserRequest} request client request
* @param {Program} program active program
* @param {boolean} permit_null whether null values should be retained
*
* @return {Object} filtered data
*/
'public sanitizeDiff'( data, request, program, permit_null )
{
permit_null = ( permit_null === undefined ) ? false : permit_null;
if ( !request.getSession().isInternal() )
{
this._cleanInternals( data, program );
}
const types = program.meta.qtypes;
return this._filter.filter( data, types, {}, permit_null );
},
/**
* Strip internal fields from diff `data`
*
* Internal fields are defined by the program `program`.
*
* @param {Object} data bucket diff data
* @param {Program} program active program
*
* @return {undefined}
*/
'private _cleanInternals'( data, program )
{
for ( let id in program.internal )
{
delete data[ id ];
}
},
/**
* Trigger metadata Data API requests
*
* @param {DataApiManager} dapi_manager dapi manager
* @param {Program} program active program
* @param {Object} data client-provided data
* @param {Bucket} bucket active bucket
*
* @return {undefined}
*/
'private _triggerDapis'( dapi_manager, program, data, bucket )
{
const {
mapis = {},
meta: {
fields = {},
},
} = program;
const dapi_fields = this._determineDapiFields( mapis, data );
return Object.keys( dapi_fields ).map( field =>
{
const { dapi } = fields[ field ];
const indexes = dapi_fields[ field ];
return indexes.map( i =>
this._metaSource.getFieldData(
field,
i,
dapi_manager,
dapi,
this._mapDapiData( dapi, bucket, i, data )
)
);
} ).reduce( ( result, x ) => result.concat( x ), [] );
},
/**
* Determine which fields require a Data API to be triggered
*
* @param {Object} mapis metadata dapi descriptors
* @param {Object} data client-provided data
*
* @return {Object} fields with indexes in need of dapi calls
*/
'private _determineDapiFields'( mapis, data )
{
return Object.keys( mapis ).reduce(
( result, src_field ) =>
{
if ( data[ src_field ] === undefined )
{
return result;
}
const fields = mapis[ src_field ];
// get each index that changed
fields.forEach( field =>
{
result[ field ] = result[ field ] || [];
Object.keys( data[ src_field ] ).forEach( i =>
{
if ( data[ src_field ][ i ] !== undefined )
{
result[ field ][ i ] = i;
}
} );
} );
return result;
},
{}
);
},
/**
* Map data from bucket to dapi inputs
*
* @param {Object} dapi Data API descriptor
* @param {Bucket} bucket active (source) bucket
* @param {number} index field index
* @param {Object} diff_data client-provided data
*
* @return {Object} key/value dapi input data
*/
'private _mapDapiData'( dapi, bucket, index, diff_data )
{
const { mapsrc } = dapi;
return Object.keys( mapsrc ).reduce(
( result, srcid ) =>
{
const bucketid = mapsrc[ srcid ];
const bdata = ( diff_data[ bucketid ] || [] )[ index ] ||
( bucket.getDataByName( bucketid ) || [] )[ index ];
result[ srcid ] = bdata || [];
return result;
},
{}
);
},
} );

View File

@@ -0,0 +1,69 @@
/**
* Instantiate appropriate DataApi
*
* Copyright (C) 2017 R-T Specialty, LLC.
*
* This file is part of the Liza Data Collection Framework.
*
* liza is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*/
const { Class } = require( 'easejs' );
const {
DataApiFactory,
http: {
NodeHttpImpl,
SpoofedNodeHttpImpl,
},
} = require( '../..' ).dapi;
/**
* Instantiates the appropriate DataApi object for the given service type
*/
module.exports = Class( 'ServerDataApiFactory' )
.extend( DataApiFactory,
{
/**
* Origin URL
* @type {string}
*/
'private _origin': '',
/**
* Request on behalf of user session
* @type {UserSession}
*/
'private _session': null,
constructor( origin, session )
{
this._origin = ''+origin;
this._session = session;
},
'override protected createHttpImpl'()
{
return NodeHttpImpl.use( SpoofedNodeHttpImpl( this._session ) )(
{
http: require( 'http' ),
https: require( 'https' ),
},
require( 'url' ),
this._origin
);
},
} );

View File

@@ -551,6 +551,22 @@ module.exports = Class.extend( require( 'events' ).EventEmitter,
},
'public getHostAddr': function()
{
return this.request.headers['x-forwarded-host']
|| this.request.headers.host;
},
'public getOrigin': function()
{
// note: the HTTP header name is (per the spec) misspelled "referer"
const referrer = this.request.headers.referer || "";
return this.request.headers.origin
|| ( referrer.match( '^[a-z]+://[^/]+' ) || [] )[ 0 ];
},
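The referrer fallback in `getOrigin` keeps only the `scheme://host` prefix of the referring URL. A quick sketch of that extraction in isolation:

```javascript
// Sketch of the referrer fallback used by getOrigin: keep only the
// scheme://host portion of the referring URL (String#match accepts a
// string pattern, which is compiled to a RegExp)
function originFromReferrer( referrer )
{
    return ( String( referrer ).match( '^[a-z]+://[^/]+' ) || [] )[ 0 ];
}

const origin = originFromReferrer( "https://example.com/quote/123" );
```

When no referrer is available the match yields no result and the function returns `undefined`, mirroring the behavior of the method above when both headers are absent.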
'public getUserAgent': function()
{
return this.request.headers['user-agent'];

View File

@@ -110,14 +110,27 @@ describe( "DapiMetaSource", () =>
failc( e );
};
return Sut( () => getStubBucket() )
.getFieldData( 'name', 0, dapim, {}, {} )
.catch( given_e =>
{
expect( given_e ).to.equal( e );
return expect(
Sut( () => getStubBucket() )
.getFieldData( 'name', 0, dapim, {}, {} )
).to.eventually.be.rejectedWith( e );
} );
return true;
} );
it( "rejects if more than one result is returned from dapi", () =>
{
const dapim = createStubDapiManager();
dapim.getApiData = ( _, __, callback ) =>
{
// more than one result
callback( null, [ {}, {} ] );
};
return expect(
Sut( () => getStubBucket() )
.getFieldData( 'name', 0, dapim, {}, {} )
).to.eventually.be.rejectedWith( Error );
} );
} );

View File

@@ -0,0 +1,307 @@
/**
* Manages DataAPI requests and return data
*
* Copyright (C) 2017 R-T Specialty, LLC.
*
* This file is part of the Liza Data Collection Framework.
*
* liza is free software: you can redistribute it and/or modify
* it under the terms of the GNU Affero General Public License as
* published by the Free Software Foundation, either version 3 of the
* License, or (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*/
'use strict';
const { Class } = require( 'easejs' );
const { expect } = require( 'chai' );
const Sut = require( '../../../' ).server.request.DataProcessor;
describe( 'DataProcessor', () =>
{
[
{
label: "strips internal field data when not internal",
data: {
internal: [ "foo", "bar" ],
foo: [ "bar", "baz" ],
},
internals: { internal: true },
internal: false,
expected: {
foo: [ "bar", "baz" ],
},
},
{
label: "keeps internal field data when internal",
data: {
internal: [ "foo", "bar" ],
foo: [ "bar", "baz" ],
},
internals: { internal: true },
internal: true,
expected: {
internal: [ "foo", "bar" ],
foo: [ "bar", "baz" ],
},
},
].forEach( ( { label, internal, data, internals = {}, expected } ) =>
{
const { request, program, sut } =
createSutFromStubs( internal, internals );
it( label, () =>
{
expect(
sut.processDiff( data, request, program ).filtered
).to.deep.equal( expected );
} );
} );
it( "passes data to bucket filter", () =>
{
const { request, program, meta_source } = createStubs();
const data = {};
const types = {};
program.meta.qtypes = types;
const filter = {
filter( given_data, given_types, given_ignore, given_null )
{
expect( given_data ).to.equal( data );
expect( given_types ).to.equal( types );
expect( given_null ).to.equal( false );
// not used
expect( given_ignore ).to.deep.equal( {} );
data.filtered = true;
}
};
Sut( filter, () => {}, meta_source )
.processDiff( data, request, program );
expect( data.filtered ).to.equal( true );
} );
it( "instantiates dapi manager using program and session", done =>
{
const { filter, request, program } = createStubs();
const dapi_factory = ( given_apis, given_request ) =>
{
expect( given_apis ).to.equal( program.apis );
expect( given_request ).to.equal( request );
done();
}
Sut( filter, dapi_factory )
.processDiff( {}, request, program );
} );
it( "invokes dapi manager when monitored bucket value changes", () =>
{
const triggered = {};
// g prefix = "given"
const getFieldData = function( gfield, gindex, gdapim, gdapi, gdata )
{
triggered[ gdapi.name ] = triggered[ gdapi.name ] || [];
triggered[ gdapi.name ][ gindex ] = arguments;
return Promise.resolve( true );
};
const dapi_manager = {};
const {
request,
program,
filter,
meta_source,
} = createStubs( false, {}, getFieldData );
const sut = Sut( filter, () => dapi_manager, meta_source );
program.meta.fields = {
foo: {
dapi: {
name: 'dapi_foo',
mapsrc: { ina: 'src', inb: 'src1' },
},
},
bar: {
dapi: {
name: 'dapi_bar',
mapsrc: { ina: 'src1' },
},
},
baz: {
dapi: {
name: 'dapi_no_call',
mapsrc: {},
},
},
};
program.mapis = {
src: [ 'foo', 'bar' ], // change
src1: [ 'foo' ], // change
src2: [ 'baz' ], // do not change
};
// data changed
const data = {
src: [ 'src0', 'src1' ],
src1: [ undefined, 'src11' ],
};
const bucket = createStubBucket( {
src: [ 'bsrc0', 'bsrc1' ],
src1: [ 'bsrc10', 'bsrc11' ],
} );
const { dapis } = sut.processDiff( data, request, program, bucket );
const expected = {
dapi_foo: [
{
name: 'foo',
data: {
ina: data.src[ 0 ],
inb: bucket.data.src1[ 0 ],
},
},
{
name: 'foo',
data: {
ina: data.src[ 1 ],
inb: data.src1[ 1 ],
},
},
],
dapi_bar: [
undefined,
{
name: 'bar',
data: {
ina: data.src1[ 1 ],
},
},
],
};
for ( let dapi_name in expected )
{
let expected_call = expected[ dapi_name ];
for ( let i in expected_call )
{
let chk = expected_call[ i ];
if ( chk === undefined )
{
continue;
}
let [ gfield, gindex, gdapi_manager, gdapi, gdata ] =
triggered[ dapi_name ][ i ];
expect( gfield ).to.equal( chk.name );
expect( gdapi.name ).to.equal( dapi_name );
expect( +gindex ).to.equal( +i );
expect( gdapi_manager ).to.equal( dapi_manager );
// see mapsrc
expect( gdata ).to.deep.equal( chk.data );
}
}
expect( triggered.dapi_no_call ).to.equal( undefined );
return Promise.all( dapis );
} );
} );
function createSutFromStubs( /* see createStubs */ )
{
const { request, program, filter, meta_source } =
createStubs.apply( null, arguments );
return {
request: request,
program: program,
filter: filter,
meta_source: meta_source,
sut: Sut( filter, () => {}, meta_source ),
};
}
function createStubs( internal, internals, getFieldData )
{
return {
request: createStubUserRequest( internal || false ),
program: createStubProgram( internals || {} ),
filter: { filter: _ => _ },
meta_source: createStubDapiMetaSource( getFieldData ),
};
}
function createStubUserRequest( internal )
{
return {
getSession: () => ( {
isInternal: () => internal
} )
};
}
function createStubProgram( internals )
{
return {
internal: internals,
meta: { qtypes: {}, fields: {} },
apis: {},
};
}
function createStubDapiMetaSource( getFieldData )
{
return {
getFieldData: getFieldData ||
function( field, index, dapi_manager, dapi, data ){},
};
}
function createStubBucket( data )
{
return {
data: data,
getDataByName( name )
{
return data[ name ];
},
};
}