Bug 1141296 - make Loop use its own markup, not the SDK's, r=Standard8

Squashed commit of the following:

commit 708932504ff4b5cc219fdc0f021e750296214f03
Author: Dan Mosedale <dmose@meer.net>
    Remove debugging console spew

commit 93f3dbb5c0f1470cc21234ab2c765702e5307137
Author: Dan Mosedale <dmose@meer.net>
Date:   Mon Jun 8 10:36:36 2015 -0700

    Fix bad sizing of ConversationToolbar Examples

commit 7a4458a763ebfe2341414dea766fc3e982cc6d92
Author: Mark Banner <standard8@mozilla.com>
Date:   Mon Jun 8 11:34:28 2015 +0100

    Fix data channel setup to not send the signal back to the same client
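
    A minimal sketch of the kind of targeted signalling this implies, assuming
    the OpenTok session.signal API that this patch uses elsewhere; the actual
    change may differ in detail. Passing the remote connection in the signal's
    "to" field keeps the readyForDataChannel message from being delivered back
    to the client that sent it:

        // Sketch only: send the readiness signal to the remote peer's
        // connection rather than broadcasting it to every client in the
        // session (a broadcast is also delivered back to ourselves).
        this.session.signal({
          type: "readyForDataChannel",
          to: connection
        }, function(signalError) {
          if (signalError) {
            console.error("readyForDataChannel signal failed:", signalError);
          }
        });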

commit d78dd64de2681545f874fc75615f601c2678ebca
Author: Mark Banner <standard8@mozilla.com>
Date:   Mon Jun 8 10:01:40 2015 +0100

    Remove the box-shadow for local desktop elements, per bug 1112021 - fix the ongoing conversation window display

commit 44c8176226ed5e0313fd70e781e0876ac22ba7cb
Author: Dan Mosedale <dmose@meer.net>
Date:   Fri Jun 5 14:08:35 2015 -0700

    Add jsdoc for makeActiveRoomStore

commit 665ad8cfd05c8f061278b3f2096d57f0a30477cb
Author: Dan Mosedale <dmose@meer.net>
Date:   Fri Jun 5 14:03:55 2015 -0700

    Cleaner way to skip a test so it shows up as pending
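
    Loop's unit tests run under Mocha, where the usual way to get a test
    reported as pending is to omit the spec callback, or to use it.skip. An
    illustrative sketch, not the actual test touched here:

        describe("data channel setup", function() {
          // No callback: Mocha lists this spec as pending in the results.
          it("should send a message over the data channel");

          // Alternatively, keep the body but mark it skipped; this is also
          // reported as pending rather than silently dropped.
          it.skip("should queue messages until the channel opens", function() {
            // test body kept around for later
          });
        });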

commit d97005a8d590cd70b14738605438da5d467fb6bc
Author: Dan Mosedale <dmose@meer.net>
Date:   Fri Jun 5 14:00:34 2015 -0700

    More jsdoc/XXX cleanup

commit db3e6a5fd717961bd2b88b881211e2515e42117a
Author: Dan Mosedale <dmose@meer.net>
Date:   Fri Jun 5 13:35:15 2015 -0700

    Rename _onRemoteSessionSubscribed to _setupDataChannelIfNeeded

commit 557f2e143b1942b4cbc02e5fd21047a556a20f71
Author: Dan Mosedale <dmose@meer.net>
Date:   Fri Jun 5 13:32:53 2015 -0700

    Handle and/or remove various XXX and YYY comments

commit 1e25b6d7d3a2916f3d52a40172fae64e12fa33e6
Author: Mark Banner <standard8@mozilla.com>
Date:   Fri Jun 5 18:08:40 2015 +0100

    Backout data channel disabling

commit 617e73d43bf7a129f1098b37a26dab358011ebb2
Author: Mark Banner <standard8@mozilla.com>
Date:   Fri Jun 5 17:58:26 2015 +0100

    Fix some more review comments

commit efb3a11b2ff6c667690af015c607c04e9a9b21b9
Author: Mark Banner <standard8@mozilla.com>
Date:   Fri Jun 5 13:34:32 2015 +0100

    Make joining a room with screenshare work reasonably well, until we can refactor the layout not to depend so much on element sizes/locations

commit fe1033fb7b16c13175d2ae4e65e28d7cd0465628
Author: Mark Banner <standard8@mozilla.com>
Date:   Fri Jun 5 12:20:32 2015 +0100

    Fix a strange quirk at 640px - @media min-width and max-width both match 640px (they are inclusive)
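
    The inclusive-bounds behaviour can be checked directly from script: with
    the window sized to exactly 640px, a (max-width: 640px) query and a
    (min-width: 640px) query both match, so rules from both breakpoints apply
    at once. Moving the larger breakpoint to 641px removes the overlap.

        // At a viewport width of exactly 640px:
        window.matchMedia("(max-width: 640px)").matches; // true
        window.matchMedia("(min-width: 640px)").matches; // true - overlap
        window.matchMedia("(min-width: 641px)").matches; // false - no overlap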

commit ee0e76759accb76e817dfec678a70d073e45d2ba
Author: Mark Banner <standard8@mozilla.com>
Date:   Fri Jun 5 12:03:06 2015 +0100

    Add some jsdoc comments and fix review nits

commit 5086f27cd252d517afaaa74529336d58078d0d2b
Author: Mark Banner <standard8@mozilla.com>
Date:   Fri Jun 5 11:27:24 2015 +0100

    Tidy up comments wrt mute in the otSdkDriver

commit 02a9e3c0724059af1db50ec930744756f90968aa
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 16:09:25 2015 -0700

    Remove LocalVideoDisabled action for now

commit 835012ecc4685d2538ce8de2bf03c04fc841d691
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 16:04:50 2015 -0700

    Fix commentary and remove extraneous string

commit a0daab47c321ed13f2c0693b80070892717b7134
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 16:00:58 2015 -0700

    Fix punctuation nit

commit d1c777ee4e049992e4759cfb2a756195d73e0db3
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 16:00:07 2015 -0700

    Remove duplicate setting of store state

commit e812494f3037810a3c7d08ca848e3bef8b5685f8
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 15:48:41 2015 -0700

    Indentation fix

commit b28f3d09e94769986674a9e38efc5e9605895e65
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 15:35:13 2015 -0700

    Add jsdoc with basic explanation of the React Frame component

commit 04e140aa878889d7966bc09886ec56a71fc40e1c
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 15:25:14 2015 -0700

    Fix indentation issues

commit 7376a564d953a2e9846762919b8c89eed1b290d7
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 15:20:33 2015 -0700

    Remove obsolete comment

commit a9d37c27f82bb6dbb853353bdb6a0336a19e67b1
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 14:16:11 2015 -0700

    Revert sdk.js logging changes

commit c6b2d58c87779d61deaf79f7e3dc7adf40688217
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Jun 4 14:05:16 2015 -0700

    Add tests for StandaloneRoomView changes

commit ca14a76837c81d2579a2e83f5e7ff6bc50ecd479
Author: Mark Banner <standard8@mozilla.com>
Date:   Thu Jun 4 19:03:24 2015 +0100

    Remove remaining OT_ rules.

commit 90aa8ab45adebed4ec62e07a5a06c9cacf3810c5
Author: Mark Banner <standard8@mozilla.com>
Date:   Thu Jun 4 15:47:30 2015 +0100

    Add tests for DesktopRoomConversationView

commit a16601bfb2cc2bec10ad60939dbfc911681fcea6
Author: Mark Banner <standard8@mozilla.com>
Date:   Thu Jun 4 15:19:07 2015 +0100

    Add tests for the activeRoomStore

commit 8dd6f5ae82f991222c87b0dd1bd8830b37ef4d7d
Author: Mark Banner <standard8@mozilla.com>
Date:   Thu Jun 4 14:49:10 2015 +0100

    Add tests for OngoingConversationView

commit 56b00a3ee26a45379e530d3182c96a6829ecdb54
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed Jun 3 15:35:07 2015 -0700

    Add tests for conversationStore.js and fix some possible leakage

commit 338e1d67811c773bbb051e5886eeda7e6eaa8421
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed Jun 3 14:07:00 2015 -0700

    Remove obsolete setupStreamElements params

commit 16af6c3b282ce1f144724f18e6c6ad31c01a6ad3
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed Jun 3 11:12:28 2015 -0700

    Fix functional tests for new DOM structure

commit 5c97758e82c5a31d213a406ff8b63cefe307634f
Author: Mark Banner <standard8@mozilla.com>
Date:   Wed Jun 3 17:44:24 2015 +0100

    Fix ui-showcase detection of errors

commit a421af9d48e66ab1c0f0b5da7d89122ed41d6a40
Author: Mark Banner <standard8@mozilla.com>
Date:   Wed Jun 3 13:49:57 2015 +0100

    Fix the location of the local video when remote video is muted or not being shown.

commit a095d9b3f9cdf418184428d13f954cb623932e4a
Author: Mark Banner <standard8@mozilla.com>
Date:   Wed Jun 3 13:05:01 2015 +0100

    Switch UI-Showcase to use a slightly bigger size for the current standalone views to properly reflect what desktop displays

commit 9a83958eebc788656c8d2002b5ea00b47be1bf53
Author: Mark Banner <standard8@mozilla.com>
Date:   Wed Jun 3 12:16:01 2015 +0100

    Remove the now unnecessary getLocal/Remote/ScreenShareElement functions

commit 4ecde728f92333b3203ffec2436c5e76cdb25a4e
Author: Mark Banner <standard8@mozilla.com>
Date:   Wed Jun 3 12:00:02 2015 +0100

    Fix screenshare video element size.

commit f44e57aa02a59960a7d4d7cfdf22bf952110b1cb
Author: Mark Banner <standard8@mozilla.com>
Date:   Wed Jun 3 11:53:58 2015 +0100

    Add mirror transform back to the local video

commit 87bb79f0c665bee2736af1c75868e973698fb21d
Author: Dan Mosedale <dmose@meer.net>
Date:   Tue Jun 2 15:12:28 2015 -0700

    otSdkDriver element cleanup and automated tests

commit 699f7c05a9b5be7e1790d60f33a1444331e1399e
Author: Mark Banner <standard8@mozilla.com>
Date:   Tue Jun 2 18:09:42 2015 +0100

    WIP Move media element handling to a shared view to reduce code duplication.

commit d8d8a5435134f1003f6e290fe4b4f838208d42f2
Author: Dan Mosedale <dmose@meer.net>
Date:   Sun May 31 17:58:52 2015 -0700

    WIP port remaining screenshare magic to view

commit 01446056a5f2459f787b7e6fbe0a74fd979ac651
Author: Dan Mosedale <dmose@meer.net>
Date:   Fri May 29 11:42:52 2015 -0700

    WIP fix desktop facemuting by porting most video magic from driver to view

commit 41c3efed44e3216eb4563b311442c3694167b877
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 28 15:59:38 2015 -0700

    WIP standalone facemute refactor, part2; needs tests & cleanup

commit fdd027e900951b61da30746c569d18ab61a4737b
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 28 13:31:30 2015 -0700

    Renamed to this.mockPublisherEl for consistency

commit a9d351d8e06719d63d4f200c00d9ae335019b0f8
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 28 13:28:40 2015 -0700

    Renamed .screen-share__video to .screen-share-video

commit 14f5bc59afb6c90a857ddfee391a4c71d7bbdf3b
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 28 13:24:57 2015 -0700

    Rename .remote__video to .remote-video

commit 17e61b1c111c600c3703eff9c15b320279687313
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 28 13:14:03 2015 -0700

    WIP standalone room views refactor for facemuting; semi-broken

commit 59447a60ae45c332c1f1fb452ce762da01cafbb5
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 28 13:09:22 2015 -0700

    Fix more merge bustage & spelling

commit 82d1d0d87c525ffe2b962d0ef7f543ade06d0376
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 28 09:17:57 2015 -0700

    Fix eslint nits

commit f743e602e0a19f7fcff6b0622fe12f87b43efa84
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 28 09:11:21 2015 -0700

    Rename remote__video element id to remote-video

commit b69d686b7812dfa7d631989b64ffae1e305c6c06
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed May 27 13:34:40 2015 -0700

    Fix merge error by switching ready to FramedExample

commit d8c9846ddac0967adb10793b561895a7aef085c4
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 14 11:23:21 2015 -0700

    WIP Add direct call video mute showcase views + impl facemute

commit 9dbf5f5f403d088b14f68935577093e2e96409a2
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 7 10:52:55 2015 -0700

    Remove soon-to-be-unsupported standalone call URL views

commit d2bf428dc6ce56b878e21525525a62b9254bbdeb
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 7 10:51:39 2015 -0700

    Switch desktop conversation with local face mute to FramedExample

commit 48c825d6cc916f0d9384db42252a86246c313a49
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu May 7 10:09:19 2015 -0700

    Switch large OngoingConversationWindow to FramedExample

commit f72aefc7cca1014f1accdafc2bc21f29c1b0bac9
Author: Dan Mosedale <dmose@meer.net>
Date:   Tue May 5 16:30:48 2015 -0700

    Get OngoingCallView and first showcase use working

commit fb31b510738f7badf9094c600ede92f18df10763
Author: Dan Mosedale <dmose@meer.net>
Date:   Tue May 5 16:17:24 2015 -0700

    Switch last 4 standalone views to FramedExample

commit 7815eaadd157f6d210ad9bd43bbbbba114995a51
Author: Dan Mosedale <dmose@meer.net>
Date:   Mon May 4 14:56:02 2015 -0700

    Switch DesktopRoomConversationViews over to FramedExample

commit 3a70e38e78cce4afc34d934526b0eff2aefc1ea2
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed Apr 29 16:36:33 2015 -0700

    Bulletproof forcedUpdate some more

commit 75b5f8c56848983bac313d3c8ee166e3f25026ef
Author: Dan Mosedale <dmose@meer.net>
Date:   Mon May 11 14:09:17 2015 -0700

    Workaround ui-showcase race by waiting longer

commit d67c6b113c88f0e8543fe8fe2d9c174c74ad6942
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed Apr 29 16:30:44 2015 -0700

    Better error-handling in react-frame-component

commit 61e644f0bd0db8e65963a603d94d76cdc96c1bd6
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed Apr 29 12:34:17 2015 -0700

    Convert standalone joined room to FramedExample

commit b40db913af5059869898a6db2873f0493037f1cd
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed Apr 29 12:22:14 2015 -0700

    Fix grey video letter/pillarboxing regression

commit 451b9a73552d7158f68d4217368d4cc06c47c7ab
Author: Dan Mosedale <dmose@meer.net>
Date:   Tue Apr 28 13:54:12 2015 -0700

    Tweak makeActiveRoomStore to take options object and require roomState; adjust clients
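
    A hypothetical sketch of what a showcase call site might look like after
    this change; makeActiveRoomStore is a ui-showcase helper and its real
    option names may differ:

        // Illustrative only - options other than roomState are assumptions.
        var activeRoomStore = makeActiveRoomStore({
          roomState: ROOM_STATES.HAS_PARTICIPANTS, // now required
          mediaConnected: true,
          remoteVideoEnabled: true
        });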

commit adc8e96ed5266b79496d029238cf570dab20f28a
Author: Dan Mosedale <dmose@meer.net>
Date:   Tue Apr 28 12:43:32 2015 -0700

    Make StandAloneRoomView (ready) FramedExample

commit 0a5a47c6bc709ad0eb93b2517e6905babe404fb0
Author: Dan Mosedale <dmose@meer.net>
Date:   Tue Apr 28 11:17:19 2015 -0700

    Refactor activeRoomStore updating in showcase

commit e315b311e51dabb747ba3a298b1573f3eaa0ebff
Author: Dan Mosedale <dmose@meer.net>
Date:   Tue Apr 28 11:15:08 2015 -0700

    Add matchMedia hack to activeRoomStore for showcase use
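
    The intent, as the comment in standaloneRoomViews explains, is that views
    rendered inside the showcase's iframes must evaluate media queries against
    the iframe's own window. A hedged sketch of the pattern; the property name
    on the store is an assumption:

        // Sketch only: prefer a matchMedia injected onto the store by the
        // ui-showcase (taken from the iframe's contentWindow), falling back
        // to the document's own window.matchMedia.
        function getMatchMedia(activeRoomStore) {
          return activeRoomStore.matchMedia || window.matchMedia;
        }

        var isNarrow = getMatchMedia(activeRoomStore)("(max-width: 640px)").matches;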

commit 9049d32791cb32c53fd3cfcf91e96f02e6b26cf8
Author: Dan Mosedale <dmose@meer.net>
Date:   Mon Apr 27 16:36:46 2015 -0700

    Make Frame pass iframe contentWindow to onContentsRendered

commit 302912ffcfc01548b713d829bc1025957b75a52d
Author: Dan Mosedale <dmose@meer.net>
Date:   Mon Apr 27 16:10:32 2015 -0700

    Get rid of unnecessary showcase-specific rules

commit 9014e1d58607c59d70ef3bab415a747ce5366985
Author: Dan Mosedale <dmose@meer.net>
Date:   Mon Apr 27 11:37:45 2015 -0700

    Resize showcase posters to 640x480 for easy verification

commit 6298d044bf00c88566e9bf79ecd184715cf71b7d
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed Apr 8 13:54:48 2015 -0700

    Switch screensharing view to FramedExample, clean up sizing.

commit ca7cb3930b91b1ff1531c0fc7772ff8ef88bf3a4
Author: Dan Mosedale <dmose@meer.net>
Date:   Fri Apr 3 15:18:21 2015 -0700

    Switch screensharing over; add (WIP) showcase view

commit 218e9071357c0c7c8d2f8f0a2c2c93c40409122e
Author: Dan Mosedale <dmose@meer.net>
Date:   Mon Mar 30 15:25:55 2015 -0700

    Stop trying to adjust styles on an SDK-provided div we no longer have

commit 3c274303caa4ad49abfbee95e74b17242e1e099b
Author: Dan Mosedale <dmose@meer.net>
Date:   Mon Mar 30 15:24:24 2015 -0700

    Switch StandaloneRoomView w/people to iframe; ditch unneeded CSS

commit 1b661d08d29d6b47849ff3b2616310fba71b518a
Author: Dan Mosedale <dmose@meer.net>
Date:   Thu Mar 26 12:12:33 2015 -0700

    Give StandaloneRoomView posterUrl args for ui-showcase

commit d0ab0ee2798516d06d5bac2f9b383f9842c8e7e3
Author: Dan Mosedale <dmose@meer.net>
Date:   Wed Mar 25 15:08:55 2015 -0700

    Do basic CSS fixup to changed structure

commit 02e71d9352d42d1ebde386fe559288e3a9bd1335
Author: Dan Mosedale <dmose@meer.net>
Date:   Fri Mar 20 07:13:44 2015 -0700

    Get local & remote cameras working; fix tests
This commit is contained in:
Dan Mosedale 2015-06-08 10:54:49 -07:00
parent f7a51e9198
commit c509a2a687
31 changed files with 2607 additions and 928 deletions

View File

@ -565,13 +565,23 @@ loop.conversationViews = (function(mozL10n) {
var OngoingConversationView = React.createClass({displayName: "OngoingConversationView",
mixins: [
loop.store.StoreMixin("conversationStore"),
sharedMixins.MediaSetupMixin
],
propTypes: {
dispatcher: React.PropTypes.instanceOf(loop.Dispatcher).isRequired,
// local
video: React.PropTypes.object,
audio: React.PropTypes.object
// local
audio: React.PropTypes.object,
remoteVideoEnabled: React.PropTypes.bool,
// This is used from the props rather than the state to make it easier for
// the ui-showcase.
mediaConnected: React.PropTypes.bool,
// The poster URLs are for UI-showcase testing and development.
localPosterUrl: React.PropTypes.string,
remotePosterUrl: React.PropTypes.string
},
getDefaultProps: function() {
@ -581,6 +591,10 @@ loop.conversationViews = (function(mozL10n) {
};
},
getInitialState: function() {
return this.getStoreState();
},
componentDidMount: function() {
// The SDK needs to know about the configuration and the elements to use
// for display. So the best way seems to pass the information here - ideally
@ -588,9 +602,7 @@ loop.conversationViews = (function(mozL10n) {
this.props.dispatcher.dispatch(new sharedActions.SetupStreamElements({
publisherConfig: this.getDefaultPublisherConfig({
publishVideo: this.props.video.enabled
}),
getLocalElementFunc: this._getElement.bind(this, ".local"),
getRemoteElementFunc: this._getElement.bind(this, ".remote")
})
}));
},
@ -616,6 +628,18 @@ loop.conversationViews = (function(mozL10n) {
}));
},
shouldRenderRemoteVideo: function() {
if (this.props.mediaConnected) {
// If remote video is not enabled, we're muted, so we'll show an avatar
// instead.
return this.props.remoteVideoEnabled;
}
// We're not yet connected, but we don't want to show the avatar, and in
// the common case, we'll just transition to the video.
return true;
},
render: function() {
var localStreamClasses = React.addons.classSet({
local: true,
@ -628,11 +652,22 @@ loop.conversationViews = (function(mozL10n) {
React.createElement("div", {className: "conversation"},
React.createElement("div", {className: "media nested"},
React.createElement("div", {className: "video_wrapper remote_wrapper"},
React.createElement("div", {className: "video_inner remote focus-stream"})
React.createElement("div", {className: "video_inner remote focus-stream"},
React.createElement(sharedViews.MediaView, {displayAvatar: !this.shouldRenderRemoteVideo(),
posterUrl: this.props.remotePosterUrl,
mediaType: "remote",
srcVideoObject: this.state.remoteSrcVideoObject})
)
),
React.createElement("div", {className: localStreamClasses})
React.createElement("div", {className: localStreamClasses},
React.createElement(sharedViews.MediaView, {displayAvatar: !this.props.video.enabled,
posterUrl: this.props.localPosterUrl,
mediaType: "local",
srcVideoObject: this.state.localSrcVideoObject})
)
),
React.createElement(loop.shared.views.ConversationToolbar, {
dispatcher: this.props.dispatcher,
video: this.props.video,
audio: this.props.audio,
publishStream: this.publishStream,
@ -742,7 +777,10 @@ loop.conversationViews = (function(mozL10n) {
return (React.createElement(OngoingConversationView, {
dispatcher: this.props.dispatcher,
video: {enabled: !this.state.videoMuted},
audio: {enabled: !this.state.audioMuted}}
audio: {enabled: !this.state.audioMuted},
remoteVideoEnabled: this.state.remoteVideoEnabled,
mediaConnected: this.state.mediaConnected,
remoteSrcVideoObject: this.state.remoteSrcVideoObject}
)
);
}

View File

@ -565,13 +565,23 @@ loop.conversationViews = (function(mozL10n) {
var OngoingConversationView = React.createClass({
mixins: [
loop.store.StoreMixin("conversationStore"),
sharedMixins.MediaSetupMixin
],
propTypes: {
dispatcher: React.PropTypes.instanceOf(loop.Dispatcher).isRequired,
// local
video: React.PropTypes.object,
audio: React.PropTypes.object
// local
audio: React.PropTypes.object,
remoteVideoEnabled: React.PropTypes.bool,
// This is used from the props rather than the state to make it easier for
// the ui-showcase.
mediaConnected: React.PropTypes.bool,
// The poster URLs are for UI-showcase testing and development.
localPosterUrl: React.PropTypes.string,
remotePosterUrl: React.PropTypes.string
},
getDefaultProps: function() {
@ -581,6 +591,10 @@ loop.conversationViews = (function(mozL10n) {
};
},
getInitialState: function() {
return this.getStoreState();
},
componentDidMount: function() {
// The SDK needs to know about the configuration and the elements to use
// for display. So the best way seems to pass the information here - ideally
@ -588,9 +602,7 @@ loop.conversationViews = (function(mozL10n) {
this.props.dispatcher.dispatch(new sharedActions.SetupStreamElements({
publisherConfig: this.getDefaultPublisherConfig({
publishVideo: this.props.video.enabled
}),
getLocalElementFunc: this._getElement.bind(this, ".local"),
getRemoteElementFunc: this._getElement.bind(this, ".remote")
})
}));
},
@ -616,6 +628,18 @@ loop.conversationViews = (function(mozL10n) {
}));
},
shouldRenderRemoteVideo: function() {
if (this.props.mediaConnected) {
// If remote video is not enabled, we're muted, so we'll show an avatar
// instead.
return this.props.remoteVideoEnabled;
}
// We're not yet connected, but we don't want to show the avatar, and in
// the common case, we'll just transition to the video.
return true;
},
render: function() {
var localStreamClasses = React.addons.classSet({
local: true,
@ -628,11 +652,22 @@ loop.conversationViews = (function(mozL10n) {
<div className="conversation">
<div className="media nested">
<div className="video_wrapper remote_wrapper">
<div className="video_inner remote focus-stream"></div>
<div className="video_inner remote focus-stream">
<sharedViews.MediaView displayAvatar={!this.shouldRenderRemoteVideo()}
posterUrl={this.props.remotePosterUrl}
mediaType="remote"
srcVideoObject={this.state.remoteSrcVideoObject} />
</div>
</div>
<div className={localStreamClasses}>
<sharedViews.MediaView displayAvatar={!this.props.video.enabled}
posterUrl={this.props.localPosterUrl}
mediaType="local"
srcVideoObject={this.state.localSrcVideoObject} />
</div>
<div className={localStreamClasses}></div>
</div>
<loop.shared.views.ConversationToolbar
dispatcher={this.props.dispatcher}
video={this.props.video}
audio={this.props.audio}
publishStream={this.publishStream}
@ -743,6 +778,9 @@ loop.conversationViews = (function(mozL10n) {
dispatcher={this.props.dispatcher}
video={{enabled: !this.state.videoMuted}}
audio={{enabled: !this.state.audioMuted}}
remoteVideoEnabled={this.state.remoteVideoEnabled}
mediaConnected={this.state.mediaConnected}
remoteSrcVideoObject={this.state.remoteSrcVideoObject}
/>
);
}

View File

@ -579,7 +579,10 @@ loop.roomViews = (function(mozL10n) {
propTypes: {
dispatcher: React.PropTypes.instanceOf(loop.Dispatcher).isRequired,
mozLoop: React.PropTypes.object.isRequired
mozLoop: React.PropTypes.object.isRequired,
// The poster URLs are for UI-showcase testing and development.
localPosterUrl: React.PropTypes.string,
remotePosterUrl: React.PropTypes.string
},
componentWillUpdate: function(nextProps, nextState) {
@ -591,10 +594,7 @@ loop.roomViews = (function(mozL10n) {
this.props.dispatcher.dispatch(new sharedActions.SetupStreamElements({
publisherConfig: this.getDefaultPublisherConfig({
publishVideo: !this.state.videoMuted
}),
getLocalElementFunc: this._getElement.bind(this, ".local"),
getScreenShareElementFunc: this._getElement.bind(this, ".screen"),
getRemoteElementFunc: this._getElement.bind(this, ".remote")
})
}));
}
},
@ -635,6 +635,40 @@ loop.roomViews = (function(mozL10n) {
);
},
/**
* Works out if remote video should be rendered or not, depending on the
* room state and other flags.
*
* @return {Boolean} True if remote video should be rendered.
*/
shouldRenderRemoteVideo: function() {
switch(this.state.roomState) {
case ROOM_STATES.HAS_PARTICIPANTS:
if (this.state.remoteVideoEnabled) {
return true;
}
if (this.state.mediaConnected) {
// since the remoteVideo hasn't yet been enabled, if the
// media is connected, then we should be displaying an avatar.
return false;
}
return true;
case ROOM_STATES.SESSION_CONNECTED:
case ROOM_STATES.JOINED:
// this case is so that we don't show an avatar while waiting for
// the other party to connect
return true;
default:
console.warn("StandaloneRoomView.shouldRenderRemoteVideo:" +
" unexpected roomState: ", this.state.roomState);
return true;
}
},
render: function() {
if (this.state.roomName) {
this.setTitle(this.state.roomName);
@ -674,6 +708,7 @@ loop.roomViews = (function(mozL10n) {
);
}
default: {
return (
React.createElement("div", {className: "room-conversation-wrapper"},
React.createElement(sharedViews.TextChatView, {dispatcher: this.props.dispatcher}),
@ -690,10 +725,19 @@ loop.roomViews = (function(mozL10n) {
React.createElement("div", {className: "conversation room-conversation"},
React.createElement("div", {className: "media nested"},
React.createElement("div", {className: "video_wrapper remote_wrapper"},
React.createElement("div", {className: "video_inner remote focus-stream"})
React.createElement("div", {className: "video_inner remote focus-stream"},
React.createElement(sharedViews.MediaView, {displayAvatar: !this.shouldRenderRemoteVideo(),
posterUrl: this.props.remotePosterUrl,
mediaType: "remote",
srcVideoObject: this.state.remoteSrcVideoObject})
)
),
React.createElement("div", {className: localStreamClasses}),
React.createElement("div", {className: "screen hide"})
React.createElement("div", {className: localStreamClasses},
React.createElement(sharedViews.MediaView, {displayAvatar: this.state.videoMuted,
posterUrl: this.props.localPosterUrl,
mediaType: "local",
srcVideoObject: this.state.localSrcVideoObject})
)
),
React.createElement(sharedViews.ConversationToolbar, {
dispatcher: this.props.dispatcher,

View File

@ -579,7 +579,10 @@ loop.roomViews = (function(mozL10n) {
propTypes: {
dispatcher: React.PropTypes.instanceOf(loop.Dispatcher).isRequired,
mozLoop: React.PropTypes.object.isRequired
mozLoop: React.PropTypes.object.isRequired,
// The poster URLs are for UI-showcase testing and development.
localPosterUrl: React.PropTypes.string,
remotePosterUrl: React.PropTypes.string
},
componentWillUpdate: function(nextProps, nextState) {
@ -591,10 +594,7 @@ loop.roomViews = (function(mozL10n) {
this.props.dispatcher.dispatch(new sharedActions.SetupStreamElements({
publisherConfig: this.getDefaultPublisherConfig({
publishVideo: !this.state.videoMuted
}),
getLocalElementFunc: this._getElement.bind(this, ".local"),
getScreenShareElementFunc: this._getElement.bind(this, ".screen"),
getRemoteElementFunc: this._getElement.bind(this, ".remote")
})
}));
}
},
@ -635,6 +635,40 @@ loop.roomViews = (function(mozL10n) {
);
},
/**
* Works out if remote video should be rendered or not, depending on the
* room state and other flags.
*
* @return {Boolean} True if remote video should be rendered.
*/
shouldRenderRemoteVideo: function() {
switch(this.state.roomState) {
case ROOM_STATES.HAS_PARTICIPANTS:
if (this.state.remoteVideoEnabled) {
return true;
}
if (this.state.mediaConnected) {
// since the remoteVideo hasn't yet been enabled, if the
// media is connected, then we should be displaying an avatar.
return false;
}
return true;
case ROOM_STATES.SESSION_CONNECTED:
case ROOM_STATES.JOINED:
// this case is so that we don't show an avatar while waiting for
// the other party to connect
return true;
default:
console.warn("StandaloneRoomView.shouldRenderRemoteVideo:" +
" unexpected roomState: ", this.state.roomState);
return true;
}
},
render: function() {
if (this.state.roomName) {
this.setTitle(this.state.roomName);
@ -674,6 +708,7 @@ loop.roomViews = (function(mozL10n) {
);
}
default: {
return (
<div className="room-conversation-wrapper">
<sharedViews.TextChatView dispatcher={this.props.dispatcher} />
@ -690,10 +725,19 @@ loop.roomViews = (function(mozL10n) {
<div className="conversation room-conversation">
<div className="media nested">
<div className="video_wrapper remote_wrapper">
<div className="video_inner remote focus-stream"></div>
<div className="video_inner remote focus-stream">
<sharedViews.MediaView displayAvatar={!this.shouldRenderRemoteVideo()}
posterUrl={this.props.remotePosterUrl}
mediaType="remote"
srcVideoObject={this.state.remoteSrcVideoObject} />
</div>
</div>
<div className={localStreamClasses}>
<sharedViews.MediaView displayAvatar={this.state.videoMuted}
posterUrl={this.props.localPosterUrl}
mediaType="local"
srcVideoObject={this.state.localSrcVideoObject} />
</div>
<div className={localStreamClasses}></div>
<div className="screen hide"></div>
</div>
<sharedViews.ConversationToolbar
dispatcher={this.props.dispatcher}

View File

@ -254,6 +254,12 @@
left: 0px;
}
.fx-embedded .no-video {
background: black none repeat scroll 0% 0%;
height: 100%;
width: 100%;
}
.standalone .local-stream,
.standalone .remote-inset-stream {
/* required to have it superimposed to the control toolbar */
@ -512,11 +518,6 @@
width: 30%;
height: 28%;
max-height: 105px;
box-shadow: 0px 2px 4px rgba(0,0,0,.5);
}
.fx-embedded .room-conversation .local-stream {
box-shadow: none;
}
.fx-embedded .local-stream.room-preview {
@ -540,73 +541,32 @@
right: 0;
}
/*
* XXX this approach is fragile because it makes assumptions
* about the generated OT markup, any change will break it
*/
/*
* For any audio-only streams, we want to display our own background
*/
.OT_audio-only .OT_widget-container .OT_video-poster {
.avatar {
background-image: url("../img/audio-call-avatar.svg");
background-repeat: no-repeat;
background-color: #4BA6E7;
background-size: contain;
background-position: center;
/*
* Expand to fill the available space; since there is no video, there is no
* intrinsic width. XXX should really change to an <img> for clarity
*/
height: 100%;
width: 100%;
}
/*
* Audio-only. For local streams, cancel out the SDK's opacity of 0.25.
* For remote streams we leave them shaded, as otherwise its too bright.
*/
.local-stream-audio .OT_publisher .OT_video-poster {
opacity: 1
.local .avatar {
position: absolute;
z-index: 1;
}
/*
* In audio-only mode, don't display the video element, doing so interferes
* with the background opacity of the video-poster element.
*/
.OT_audio-only .OT_widget-container .OT_video-element {
display: none;
}
/*
* Ensure that the publisher (i.e. local) video is never cropped, so that it's
* not possible for someone to be presented with a picture that displays
* (for example) a person from the neck up, even though the camera is capturing
* and transmitting a picture of that person from the waist up.
*
* The !importants are necessary to override the SDK attempts to avoid
* letterboxing entirely.
*
* If we could easily use test video streams with the SDK (eg if initPublisher
* supported something like a "testMediaToStreamURI" parameter that it would
* use to source the stream rather than the output of gUM, it wouldn't be too
* hard to generate a video with a 1 pixel border at the edges that one could
* at least visually see wasn't being cropped.
*
* Another less ugly possibility would be to work with Ted Mielczarek to use
* the fake camera drivers he has for Linux.
*/
.room-conversation .OT_publisher .OT_widget-container {
height: 100% !important;
width: 100% !important;
top: 0 !important;
left: 0 !important;
background-color: transparent; /* avoid visually obvious letterboxing */
}
.room-conversation .OT_publisher .OT_widget-container video {
background-color: transparent; /* avoid visually obvious letterboxing */
}
.fx-embedded .room-conversation .room-preview .OT_publisher .OT_widget-container,
.fx-embedded .room-conversation .room-preview .OT_publisher .OT_widget-container video {
/* Desktop conversation window room preview local stream actually wants
a black background */
background-color: #000;
.remote .avatar {
/* make visually distinct from local avatar */
opacity: 0.25;
}
.fx-embedded .media.nested {
@ -712,7 +672,8 @@ html, .fx-embedded, #main,
margin: auto;
}
@media screen and (min-width:640px) {
/* We use 641px rather than 640, as min-width and max-width are inclusive */
@media screen and (min-width:641px) {
.standalone .conversation-toolbar {
position: absolute;
bottom: 0;
@ -766,11 +727,6 @@ html, .fx-embedded, #main,
height: 90%;
}
.standalone .OT_subscriber {
height: 100%;
width: auto;
}
.standalone .media.nested {
min-height: 500px;
}
@ -798,7 +754,7 @@ html, .fx-embedded, #main,
.standalone .video_wrapper.remote_wrapper {
/* Because of OT markup we need to set a high flex value
* Flex rule assures remote and local streams stack on top of eachother
* Flex rule assures remote and local streams stack on top of each other
* Computed width is not 100% unless the `width` rule */
flex: 2;
width: 100%;
@ -1278,7 +1234,7 @@ body[dir=rtl] .room-context-btn-edit {
.standalone .room-conversation .video_wrapper.remote_wrapper {
background-color: #4e4e4e;
width: 75%;
width: calc(75% - 10px); /* Take the left margin into account. */
}
.standalone .room-conversation .conversation-toolbar {
@ -1401,7 +1357,7 @@ body[dir=rtl] .room-context-btn-edit {
@media screen and (max-height:160px) {
/* disable the self view */
.standalone .OT_publisher {
.standalone .local-video {
display: none;
}
@ -1412,3 +1368,36 @@ body[dir=rtl] .room-context-btn-edit {
top: 90px;
}
}
.remote-video {
/* Since there is grey stuff behind us, avoid obvious letterboxing, only do
* this on remote video as local video has transparent background.
*/
background-color: black;
}
.standalone .screen.focus-stream {
/* Since there is grey stuff behind us, avoid obvious letterboxing */
background-color: black;
}
.local-video {
width: 100%;
height: 100%;
/* Transform is to make the local video act like a mirror, as is the
convention in video conferencing systems. */
transform: scale(-1, 1);
transform-origin: 50% 50% 0;
}
.remote-video {
width: 100%;
height: 100%;
display: block;
position: absolute;
}
.screen-share-video {
width: 100%;
height: 100%;
}

View File

@ -193,14 +193,7 @@ loop.shared.actions = (function() {
*/
SetupStreamElements: Action.define("setupStreamElements", {
// The configuration for the publisher/subscribe options
publisherConfig: Object,
// The local stream element
getLocalElementFunc: Function,
// The screen share element; optional until all conversation
// types support it.
// getScreenShareElementFunc: Function,
// The remote stream element
getRemoteElementFunc: Function
publisherConfig: Object
}),
/**
@ -225,6 +218,42 @@ loop.shared.actions = (function() {
dimensions: Object
}),
/**
* Video has been enabled from the remote sender.
*
* XXX somewhat tangled up with remote video muting semantics; see bug
* 1171969
*
* @note if/when we want to untangle this, we may want to include the
* reason provided by the SDK, documented here:
* https://tokbox.com/opentok/libraries/client/js/reference/VideoEnabledChangedEvent.html
*/
RemoteVideoEnabled: Action.define("remoteVideoEnabled", {
/* The SDK video object that the views will be copying the remote
stream from. */
srcVideoObject: Object
}),
/**
* Video has been disabled by the remote sender.
*
* @see RemoteVideoEnabled
*/
RemoteVideoDisabled: Action.define("remoteVideoDisabled", {
}),
/**
* Video from the local camera has been enabled.
*
* XXX we should implement a LocalVideoDisabled action to cleanly prevent
* leakage; see bug 1171978 for details
*/
LocalVideoEnabled: Action.define("localVideoEnabled", {
/* The SDK video object that the views will be copying the remote
stream from. */
srcVideoObject: Object
}),
/**
* Used to mute or unmute a stream
*/
@ -250,7 +279,7 @@ loop.shared.actions = (function() {
}),
/**
* Used to notifiy that screen sharing is active or not.
* Used to notify that screen sharing is active or not.
*/
ScreenSharingState: Action.define("screenSharingState", {
// One of loop.shared.utils.SCREEN_SHARE_STATES.
@ -259,9 +288,13 @@ loop.shared.actions = (function() {
/**
* Used to notify that a shared screen is being received (or not).
*
* XXX this is going to need to be split into two actions so we
* can display a spinner when the screen share is pending (bug 1171933)
*/
ReceivingScreenShare: Action.define("receivingScreenShare", {
receiving: Boolean
// srcVideoObject: Object (only present if receiving is true)
}),
/**

View File

@ -77,10 +77,15 @@ loop.store.ActiveRoomStore = (function() {
*/
_statesToResetOnLeave: [
"audioMuted",
"localSrcVideoObject",
"localVideoDimensions",
"mediaConnected",
"receivingScreenShare",
"remoteSrcVideoObject",
"remoteVideoDimensions",
"remoteVideoEnabled",
"screenSharingState",
"screenShareVideoObject",
"videoMuted"
],
@ -95,6 +100,7 @@ loop.store.ActiveRoomStore = (function() {
roomState: ROOM_STATES.INIT,
audioMuted: false,
videoMuted: false,
remoteVideoEnabled: false,
failureReason: undefined,
// Tracks if the room has been used during this
// session. 'Used' means at least one call has been placed
@ -115,7 +121,10 @@ loop.store.ActiveRoomStore = (function() {
roomInfoFailure: null,
// The name of the room.
roomName: null,
socialShareProviders: null
// Social API state.
socialShareProviders: null,
// True if media has been connected both-ways.
mediaConnected: false
};
},
@ -169,11 +178,15 @@ loop.store.ActiveRoomStore = (function() {
"windowUnload",
"leaveRoom",
"feedbackComplete",
"localVideoEnabled",
"remoteVideoEnabled",
"remoteVideoDisabled",
"videoDimensionsChanged",
"startScreenShare",
"endScreenShare",
"updateSocialShareInfo",
"connectionStatus"
"connectionStatus",
"mediaConnected"
]);
},
@ -550,6 +563,41 @@ loop.store.ActiveRoomStore = (function() {
this.setStoreState(muteState);
},
/**
* Records the local video object for the room.
*
* @param {sharedActions.LocalVideoEnabled} actionData
*/
localVideoEnabled: function(actionData) {
this.setStoreState({localSrcVideoObject: actionData.srcVideoObject});
},
/**
* Records the remote video object for the room.
*
* @param {sharedActions.RemoteVideoEnabled} actionData
*/
remoteVideoEnabled: function(actionData) {
this.setStoreState({
remoteVideoEnabled: true,
remoteSrcVideoObject: actionData.srcVideoObject
});
},
/**
* Records when remote video is disabled (e.g. due to mute).
*/
remoteVideoDisabled: function() {
this.setStoreState({remoteVideoEnabled: false});
},
/**
* Records when the remote media has been connected.
*/
mediaConnected: function() {
this.setStoreState({mediaConnected: true});
},
/**
* Used to note the current screensharing state.
*/
@ -563,6 +611,9 @@ loop.store.ActiveRoomStore = (function() {
/**
* Used to note the current state of receiving screenshare data.
*
* XXX this is going to need to be split into two actions so we
* can display a spinner when the screen share is pending (bug 1171933)
*/
receivingScreenShare: function(actionData) {
if (!actionData.receiving &&
@ -573,10 +624,15 @@ loop.store.ActiveRoomStore = (function() {
delete newDimensions.screen;
this.setStoreState({
receivingScreenShare: actionData.receiving,
remoteVideoDimensions: newDimensions
remoteVideoDimensions: newDimensions,
screenShareVideoObject: null
});
} else {
this.setStoreState({receivingScreenShare: actionData.receiving});
this.setStoreState({
receivingScreenShare: actionData.receiving,
screenShareVideoObject: actionData.srcVideoObject ?
actionData.srcVideoObject : null
});
}
},
@ -676,7 +732,10 @@ loop.store.ActiveRoomStore = (function() {
* one participant leaves.
*/
remotePeerDisconnected: function() {
this.setStoreState({roomState: ROOM_STATES.SESSION_CONNECTED});
this.setStoreState({
roomState: ROOM_STATES.SESSION_CONNECTED,
remoteSrcVideoObject: null
});
},
/**

View File

@ -93,6 +93,8 @@ loop.store = loop.store || {};
callId: undefined,
// The caller id of the contacting side
callerId: undefined,
// True if media has been connected both-ways.
mediaConnected: false,
// The connection progress url to connect the websocket
progressURL: undefined,
// The websocket token that allows connection to the progress url
@ -103,10 +105,11 @@ loop.store = loop.store || {};
sessionId: undefined,
// SDK session token
sessionToken: undefined,
// If the audio is muted
// If the local audio is muted
audioMuted: false,
// If the video is muted
videoMuted: false
// If the local video is muted
videoMuted: false,
remoteVideoEnabled: false
};
},
@ -232,6 +235,9 @@ loop.store = loop.store || {};
"mediaConnected",
"setMute",
"fetchRoomEmailLink",
"localVideoEnabled",
"remoteVideoDisabled",
"remoteVideoEnabled",
"windowUnload"
]);
@ -408,6 +414,7 @@ loop.store = loop.store || {};
*/
mediaConnected: function() {
this._websocket.mediaUp();
this.setStoreState({mediaConnected: true});
},
/**
@ -440,6 +447,44 @@ loop.store = loop.store || {};
}.bind(this));
},
/**
* Handles when the remote stream has been enabled and is supplied.
*
* @param {sharedActions.RemoteVideoEnabled} actionData
*/
remoteVideoEnabled: function(actionData) {
this.setStoreState({
remoteVideoEnabled: true,
remoteSrcVideoObject: actionData.srcVideoObject
});
},
/**
* Handles when the remote stream has been disabled, e.g. due to video mute.
*
* @param {sharedActions.RemoteVideoDisabled} actionData
*/
remoteVideoDisabled: function(actionData) {
this.setStoreState({
remoteVideoEnabled: false,
remoteSrcVideoObject: undefined});
},
/**
* Handles when the local stream is supplied.
*
* XXX should write a localVideoDisabled action in otSdkDriver.js to
* positively ensure proper cleanup (handled by window teardown currently)
* (see bug 1171978)
*
* @param {sharedActions.LocalVideoEnabled} actionData
*/
localVideoEnabled: function(actionData) {
this.setStoreState({
localSrcVideoObject: actionData.srcVideoObject
});
},
/**
* Called when the window is unloaded, either by code, or by the user
* explicitly closing it. Expected to do any necessary housekeeping, such

View File

@ -104,9 +104,6 @@ loop.OTSdkDriver = (function() {
* with the action. See action.js.
*/
setupStreamElements: function(actionData) {
this.getLocalElement = actionData.getLocalElementFunc;
this.getScreenShareElementFunc = actionData.getScreenShareElementFunc;
this.getRemoteElement = actionData.getRemoteElementFunc;
this.publisherConfig = actionData.publisherConfig;
this.sdk.on("exception", this._onOTException.bind(this));
@ -122,8 +119,13 @@ loop.OTSdkDriver = (function() {
* XXX This can be simplified when bug 1138851 is actioned.
*/
_publishLocalStreams: function() {
this.publisher = this.sdk.initPublisher(this.getLocalElement(),
// We expect the local video to be muted automatically by the SDK. Hence
// we don't mute it manually here.
this._mockPublisherEl = document.createElement("div");
this.publisher = this.sdk.initPublisher(this._mockPublisherEl,
_.extend(this._getDataChannelSettings, this._getCopyPublisherConfig));
this.publisher.on("streamCreated", this._onLocalStreamCreated.bind(this));
this.publisher.on("streamDestroyed", this._onLocalStreamDestroyed.bind(this));
this.publisher.on("accessAllowed", this._onPublishComplete.bind(this));
@ -182,7 +184,9 @@ loop.OTSdkDriver = (function() {
var config = _.extend(this._getCopyPublisherConfig, options);
this.screenshare = this.sdk.initPublisher(this.getScreenShareElementFunc(),
this._mockScreenSharePreviewEl = document.createElement("div");
this.screenshare = this.sdk.initPublisher(this._mockScreenSharePreviewEl,
config);
this.screenshare.on("accessAllowed", this._onScreenShareGranted.bind(this));
this.screenshare.on("accessDenied", this._onScreenShareDenied.bind(this));
@ -209,7 +213,7 @@ loop.OTSdkDriver = (function() {
* Ends an active screenshare session. Return `true` when an active screen-
* sharing session was ended or `false` when no session is active.
*
* @type {Boolean}
* @returns {Boolean}
*/
endScreenShare: function() {
if (!this.screenshare) {
@ -222,6 +226,7 @@ loop.OTSdkDriver = (function() {
this.screenshare.off("accessAllowed accessDenied streamCreated");
this.screenshare.destroy();
delete this.screenshare;
delete this._mockScreenSharePreviewEl;
this._noteSharingState(this._windowId ? "browser" : "window", false);
delete this._windowId;
return true;
@ -289,6 +294,7 @@ loop.OTSdkDriver = (function() {
delete this._publisherReady;
delete this._publishedLocalStream;
delete this._subscribedRemoteStream;
delete this._mockPublisherEl;
this.connections = {};
this._setTwoWayMediaStartTime(this.CONNECTION_START_TIME_UNINITIALIZED);
},
@ -499,19 +505,23 @@ loop.OTSdkDriver = (function() {
* https://tokbox.com/opentok/libraries/client/js/reference/Stream.html
*/
_handleRemoteScreenShareCreated: function(stream) {
if (!this.getScreenShareElementFunc) {
return;
}
// Let the stores know first so they can update the display.
this.dispatcher.dispatch(new sharedActions.ReceivingScreenShare({
receiving: true
}));
// XXX We do want to do this - we want them to start re-arranging the
// display so that we can a) indicate connecting, b) be ready for
// when we get the stream. However, we're currently limited by the fact
// the view calculations require the remote (aka screen share) element to
// be present and laid out. Hence, we need to drop this for the time being,
// and let the client know via _onScreenShareSubscribeCompleted.
// Bug 1171933 is going to look at fixing this.
// this.dispatcher.dispatch(new sharedActions.ReceivingScreenShare({
// receiving: true
// }));
var remoteElement = this.getScreenShareElementFunc();
this.session.subscribe(stream,
remoteElement, this._getCopyPublisherConfig);
// There's no audio for screen shares so we don't need to worry about mute.
this._mockScreenShareEl = document.createElement("div");
this.session.subscribe(stream, this._mockScreenShareEl,
this._getCopyPublisherConfig,
this._onScreenShareSubscribeCompleted.bind(this));
},
/**
@ -536,17 +546,88 @@ loop.OTSdkDriver = (function() {
return;
}
var remoteElement = this.getRemoteElement();
// Setting up the subscribe might want to be before the VideoDimensionsChange
// dispatch. If so, we might also want to consider moving the dispatch to
// _onSubscribeCompleted. However, this seems to work fine at the moment,
// so we haven't felt the need to move it.
// XXX This mock element currently handles playing audio for the session.
// We might want to consider making the react tree responsible for playing
// the audio, so that the incoming audio could be disabled/tracked easily from
// the UI (bug 1171896).
this._mockSubscribeEl = document.createElement("div");
this.subscriber = this.session.subscribe(event.stream,
remoteElement, this._getCopyPublisherConfig,
this._onRemoteSessionSubscribed.bind(this, event.stream.connection));
this._mockSubscribeEl, this._getCopyPublisherConfig,
this._onSubscribeCompleted.bind(this));
},
/**
* This method is passed as the "completionHandler" parameter to the SDK's
* Session.subscribe.
*
* @param err {(null|Error)} - null on success, an Error object otherwise
* @param sdkSubscriberObject {OT.Subscriber} - undocumented; returned on success
* @param subscriberVideo {HTMLVideoElement} - used for unit testing
*/
_onSubscribeCompleted: function(err, sdkSubscriberObject, subscriberVideo) {
// XXX test for and handle errors better (bug 1172140)
if (err) {
console.log("subscribe error:", err);
return;
}
var sdkSubscriberVideo = subscriberVideo ? subscriberVideo :
this._mockSubscribeEl.querySelector("video");
if (!sdkSubscriberVideo) {
console.error("sdkSubscriberVideo unexpectedly falsy!");
}
sdkSubscriberObject.on("videoEnabled", this._onVideoEnabled.bind(this));
sdkSubscriberObject.on("videoDisabled", this._onVideoDisabled.bind(this));
// XXX for some reason, the SDK deliberately suppresses sending the
// videoEnabled event after subscribe, in spite of docs claiming
// otherwise, so we do it ourselves.
if (sdkSubscriberObject.stream.hasVideo) {
this.dispatcher.dispatch(new sharedActions.RemoteVideoEnabled({
srcVideoObject: sdkSubscriberVideo}));
}
this._subscribedRemoteStream = true;
if (this._checkAllStreamsConnected()) {
this._setTwoWayMediaStartTime(performance.now());
this.dispatcher.dispatch(new sharedActions.MediaConnected());
}
this._setupDataChannelIfNeeded(sdkSubscriberObject.stream.connection);
},
/**
* This method is passed as the "completionHandler" parameter to the SDK's
* Session.subscribe.
*
* @param err {(null|Error)} - null on success, an Error object otherwise
* @param sdkSubscriberObject {OT.Subscriber} - undocumented; returned on success
* @param subscriberVideo {HTMLVideoElement} - used for unit testing
*/
_onScreenShareSubscribeCompleted: function(err, sdkSubscriberObject, subscriberVideo) {
// XXX test for and handle errors better
if (err) {
console.log("subscribe error:", err);
return;
}
var sdkSubscriberVideo = subscriberVideo ? subscriberVideo :
this._mockScreenShareEl.querySelector("video");
// XXX no idea why this is necessary in addition to the dispatch in
// _handleRemoteScreenShareCreated. Maybe these should be separate
// actions. But even so, this shouldn't be necessary....
this.dispatcher.dispatch(new sharedActions.ReceivingScreenShare({
receiving: true, srcVideoObject: sdkSubscriberVideo
}));
},
/**
@ -554,16 +635,11 @@ loop.OTSdkDriver = (function() {
* channel set-up routines. A data channel cannot be requested before this
* time as the peer connection is not set up.
*
* @param {OT.Connection} connection The OT connection class object.
* @param {OT.Error} err Indicates if there's been an error in
* completing the subscribe.
* @param {OT.Connection} connection The OT connection class object.
*
*/
_onRemoteSessionSubscribed: function(connection, err) {
if (err) {
console.error(err);
return;
}
_setupDataChannelIfNeeded: function(connection) {
if (this._useDataChannels) {
this.session.signal({
type: "readyForDataChannel",
@ -670,6 +746,12 @@ loop.OTSdkDriver = (function() {
this._notifyMetricsEvent("Publisher.streamCreated");
if (event.stream[STREAM_PROPERTIES.HAS_VIDEO]) {
var sdkLocalVideo = this._mockPublisherEl.querySelector("video");
this.dispatcher.dispatch(new sharedActions.LocalVideoEnabled(
{srcVideoObject: sdkLocalVideo}));
this.dispatcher.dispatch(new sharedActions.VideoDimensionsChanged({
isLocal: true,
videoType: event.stream.videoType,
@ -739,6 +821,7 @@ loop.OTSdkDriver = (function() {
this._notifyMetricsEvent("Session.streamDestroyed");
if (event.stream.videoType !== "screen") {
delete this._mockSubscribeEl;
return;
}
@ -747,6 +830,8 @@ loop.OTSdkDriver = (function() {
this.dispatcher.dispatch(new sharedActions.ReceivingScreenShare({
receiving: false
}));
delete this._mockScreenShareEl;
},
/**
@ -754,6 +839,7 @@ loop.OTSdkDriver = (function() {
*/
_onLocalStreamDestroyed: function() {
this._notifyMetricsEvent("Publisher.streamDestroyed");
delete this._mockPublisherEl;
},
/**
@ -793,6 +879,8 @@ loop.OTSdkDriver = (function() {
this.dispatcher.dispatch(new sharedActions.ConnectionFailure({
reason: FAILURE_DETAILS.MEDIA_DENIED
}));
delete this._mockPublisherEl;
},
_onOTException: function(event) {
@ -804,6 +892,7 @@ loop.OTSdkDriver = (function() {
this.publisher.off("accessAllowed accessDenied accessDialogOpened streamCreated");
this.publisher.destroy();
delete this.publisher;
delete this._mockPublisherEl;
}
this.dispatcher.dispatch(new sharedActions.ConnectionFailure({
reason: FAILURE_DETAILS.UNABLE_TO_PUBLISH_MEDIA
@ -824,6 +913,42 @@ loop.OTSdkDriver = (function() {
}
},
/**
* Handle the (remote) VideoEnabled event from the subscriber object
* by dispatching an action with the (hidden) video element from
* which to copy the stream when attaching it to the visible video element
* that the views control directly.
*
* @param event {OT.VideoEnabledChangedEvent} from the SDK
*
* @see https://tokbox.com/opentok/libraries/client/js/reference/VideoEnabledChangedEvent.html
* @private
*/
_onVideoEnabled: function(event) {
var sdkSubscriberVideo = this._mockSubscribeEl.querySelector("video");
if (!sdkSubscriberVideo) {
console.error("sdkSubscriberVideo unexpectedly falsy!");
}
this.dispatcher.dispatch(
new sharedActions.RemoteVideoEnabled(
{srcVideoObject: sdkSubscriberVideo}));
},
/**
* Handle the SDK disabling of remote video by dispatching the
* appropriate event.
*
* @param event {OT.VideoEnabledChangedEvent} from the SDK
*
* @see https://tokbox.com/opentok/libraries/client/js/reference/VideoEnabledChangedEvent.html
* @private
*/
_onVideoDisabled: function(event) {
this.dispatcher.dispatch(
new sharedActions.RemoteVideoDisabled());
},
/**
* Publishes the local stream if the session is connected
* and the publisher is ready.
@ -868,6 +993,7 @@ loop.OTSdkDriver = (function() {
this.dispatcher.dispatch(new sharedActions.ScreenSharingState({
state: SCREEN_SHARE_STATES.INACTIVE
}));
delete this._mockScreenSharePreviewEl;
},
/**

View File

@ -678,13 +678,132 @@ loop.shared.views = (function(_, l10n) {
}
});
/**
* Renders an avatar element for display when video is muted.
*/
var AvatarView = React.createClass({displayName: "AvatarView",
mixins: [React.addons.PureRenderMixin],
render: function() {
return React.createElement("div", {className: "avatar"});
}
});
/**
* Renders a media element for display. This also handles displaying an avatar
* instead of the video, and attaching a video stream to the video element.
*/
var MediaView = React.createClass({displayName: "MediaView",
// srcVideoObject should be ok for a shallow comparison, so we are safe
// to use the pure render mixin here.
mixins: [React.addons.PureRenderMixin],
propTypes: {
displayAvatar: React.PropTypes.bool.isRequired,
posterUrl: React.PropTypes.string,
// Expecting "local" or "remote".
mediaType: React.PropTypes.string.isRequired,
srcVideoObject: React.PropTypes.object
},
componentDidMount: function() {
if (!this.props.displayAvatar) {
this.attachVideo(this.props.srcVideoObject);
}
},
componentDidUpdate: function() {
if (!this.props.displayAvatar) {
this.attachVideo(this.props.srcVideoObject);
}
},
/**
* Attaches a video stream from a donor video element to this component's
* video element if the component is displaying one.
*
* @param {Object} srcVideoObject The src video object to clone the stream
* from.
*
* XXX need to have a corresponding detachVideo or change this to syncVideo
* to protect from leaks (bug 1171978)
*/
attachVideo: function(srcVideoObject) {
if (!srcVideoObject) {
// Nothing to display.
return;
}
var videoElement = this.getDOMNode();
if (videoElement.tagName.toLowerCase() !== "video") {
// Must be displaying the avatar view, so don't try and attach video.
return;
}
// Set the src of our video element
var attrName = "";
if ("srcObject" in videoElement) {
// srcObject is according to the standard.
attrName = "srcObject";
} else if ("mozSrcObject" in videoElement) {
// mozSrcObject is for Firefox
attrName = "mozSrcObject";
} else if ("src" in videoElement) {
// src is for Chrome.
attrName = "src";
} else {
console.error("Error attaching stream to element - no supported attribute found");
return;
}
// If the object hasn't changed, then don't reattach it.
if (videoElement[attrName] !== srcVideoObject[attrName]) {
videoElement[attrName] = srcVideoObject[attrName];
}
videoElement.play();
},
render: function() {
if (this.props.displayAvatar) {
return React.createElement(AvatarView, null);
}
if (!this.props.srcVideoObject && !this.props.posterUrl) {
return React.createElement("div", {className: "no-video"});
}
var optionalPoster = {};
if (this.props.posterUrl) {
optionalPoster.poster = this.props.posterUrl;
}
// For now, always mute media. For local media, we should be muted anyway,
// as we don't want to hear ourselves speaking.
//
// For remote media, we would ideally have this live video element in
// control of the audio, but due to the current method of not rendering
// the element at all when video is muted we have to rely on the hidden
// dom element in the sdk driver to play the audio.
// We might want to consider changing this if we add UI controls relating
// to the remote audio at some stage in the future.
return (
React.createElement("video", React.__spread({}, optionalPoster,
{className: this.props.mediaType + "-video",
muted: true}))
);
}
});
return {
AvatarView: AvatarView,
Button: Button,
ButtonGroup: ButtonGroup,
Checkbox: Checkbox,
ConversationView: ConversationView,
ConversationToolbar: ConversationToolbar,
MediaControlButton: MediaControlButton,
MediaView: MediaView,
ScreenShareControlButton: ScreenShareControlButton,
NotificationListView: NotificationListView
};

View File

@ -678,13 +678,132 @@ loop.shared.views = (function(_, l10n) {
}
});
/**
* Renders an avatar element for display when video is muted.
*/
var AvatarView = React.createClass({
mixins: [React.addons.PureRenderMixin],
render: function() {
return <div className="avatar"/>;
}
});
/**
* Renders a media element for display. This also handles displaying an avatar
* instead of the video, and attaching a video stream to the video element.
*/
var MediaView = React.createClass({
// srcVideoObject should be ok for a shallow comparison, so we are safe
// to use the pure render mixin here.
mixins: [React.addons.PureRenderMixin],
propTypes: {
displayAvatar: React.PropTypes.bool.isRequired,
posterUrl: React.PropTypes.string,
// Expecting "local" or "remote".
mediaType: React.PropTypes.string.isRequired,
srcVideoObject: React.PropTypes.object
},
componentDidMount: function() {
if (!this.props.displayAvatar) {
this.attachVideo(this.props.srcVideoObject);
}
},
componentDidUpdate: function() {
if (!this.props.displayAvatar) {
this.attachVideo(this.props.srcVideoObject);
}
},
/**
* Attaches a video stream from a donor video element to this component's
* video element if the component is displaying one.
*
* @param {Object} srcVideoObject The src video object to clone the stream
* from.
*
* XXX need to have a corresponding detachVideo or change this to syncVideo
* to protect from leaks (bug 1171978)
*/
attachVideo: function(srcVideoObject) {
if (!srcVideoObject) {
// Nothing to display.
return;
}
var videoElement = this.getDOMNode();
if (videoElement.tagName.toLowerCase() !== "video") {
// Must be displaying the avatar view, so don't try and attach video.
return;
}
// Set the src of our video element
var attrName = "";
if ("srcObject" in videoElement) {
// srcObject is according to the standard.
attrName = "srcObject";
} else if ("mozSrcObject" in videoElement) {
// mozSrcObject is for Firefox
attrName = "mozSrcObject";
} else if ("src" in videoElement) {
// src is for Chrome.
attrName = "src";
} else {
console.error("Error attaching stream to element - no supported attribute found");
return;
}
// If the object hasn't changed, then don't reattach it.
if (videoElement[attrName] !== srcVideoObject[attrName]) {
videoElement[attrName] = srcVideoObject[attrName];
}
videoElement.play();
},
render: function() {
if (this.props.displayAvatar) {
return <AvatarView />;
}
if (!this.props.srcVideoObject && !this.props.posterUrl) {
return <div className="no-video"/>;
}
var optionalPoster = {};
if (this.props.posterUrl) {
optionalPoster.poster = this.props.posterUrl;
}
// For now, always mute media. For local media, we should be muted anyway,
// as we don't want to hear ourselves speaking.
//
// For remote media, we would ideally have this live video element in
// control of the audio, but due to the current method of not rendering
// the element at all when video is muted we have to rely on the hidden
// dom element in the sdk driver to play the audio.
// We might want to consider changing this if we add UI controls relating
// to the remote audio at some stage in the future.
return (
<video {...optionalPoster}
className={this.props.mediaType + "-video"}
muted />
);
}
});
return {
AvatarView: AvatarView,
Button: Button,
ButtonGroup: ButtonGroup,
Checkbox: Checkbox,
ConversationView: ConversationView,
ConversationToolbar: ConversationToolbar,
MediaControlButton: MediaControlButton,
MediaView: MediaView,
ScreenShareControlButton: ScreenShareControlButton,
NotificationListView: NotificationListView
};
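As a quick reference outside the diff: the feature detection in attachVideo above (srcObject, then mozSrcObject, then src) is the usual cross-browser pattern for wiring media to a video element. A minimal standalone sketch of the same idea, assuming a raw MediaStream rather than the donor video element the patch actually clones from:

  // Sketch only - not part of the patch. Mirrors MediaView.attachVideo's
  // attribute probing, but for a plain MediaStream.
  function attachStream(videoElement, mediaStream) {
    if (!videoElement || !mediaStream) {
      return;
    }
    if ("srcObject" in videoElement) {
      // Standard property.
      videoElement.srcObject = mediaStream;
    } else if ("mozSrcObject" in videoElement) {
      // Older Firefox builds.
      videoElement.mozSrcObject = mediaStream;
    } else {
      // Older Chrome: assign an object URL to src.
      videoElement.src = window.URL.createObjectURL(mediaStream);
    }
    // Keep the element muted, matching the reasoning in MediaView.render
    // above (we never want to play back our own local audio).
    videoElement.muted = true;
    videoElement.play();
  }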

View File

@ -131,7 +131,6 @@ loop.StandaloneMozLoop = (function(mozL10n) {
},
async: async,
success: function(responseData) {
console.log("done");
try {
callback(null, validate(responseData, expectedProps));
} catch (err) {

View File

@ -337,7 +337,11 @@ loop.standaloneRoomViews = (function(mozL10n) {
React.PropTypes.instanceOf(loop.store.FxOSActiveRoomStore)
]).isRequired,
dispatcher: React.PropTypes.instanceOf(loop.Dispatcher).isRequired,
isFirefox: React.PropTypes.bool.isRequired
isFirefox: React.PropTypes.bool.isRequired,
// The poster URLs are for UI-showcase testing and development
localPosterUrl: React.PropTypes.string,
remotePosterUrl: React.PropTypes.string,
screenSharePosterUrl: React.PropTypes.string
},
getInitialState: function() {
@ -385,10 +389,7 @@ loop.standaloneRoomViews = (function(mozL10n) {
if (this.state.roomState !== ROOM_STATES.MEDIA_WAIT &&
nextState.roomState === ROOM_STATES.MEDIA_WAIT) {
this.props.dispatcher.dispatch(new sharedActions.SetupStreamElements({
publisherConfig: this.getDefaultPublisherConfig({publishVideo: true}),
getLocalElementFunc: this._getElement.bind(this, ".local"),
getRemoteElementFunc: this._getElement.bind(this, ".remote"),
getScreenShareElementFunc: this._getElement.bind(this, ".screen")
publisherConfig: this.getDefaultPublisherConfig({publishVideo: true})
}));
}
@ -411,8 +412,10 @@ loop.standaloneRoomViews = (function(mozL10n) {
// Remove the custom screenshare styles on the remote camera.
var node = this._getElement(".remote");
node.removeAttribute("style");
}
// Force the video sizes to update.
if (this.state.receivingScreenShare != nextState.receivingScreenShare ||
this.state.remoteVideoEnabled != nextState.remoteVideoEnabled) {
this.updateVideoContainer();
}
},
@ -425,6 +428,32 @@ loop.standaloneRoomViews = (function(mozL10n) {
this.props.dispatcher.dispatch(new sharedActions.LeaveRoom());
},
/**
* Wrapper for window.matchMedia so that we use an appropriate version
* for the ui-showcase, which puts views inside of their own iframes.
*
* Currently, we use an icky hack, and the showcase conspires with
* react-frame-component to set iframe.contentWindow.matchMedia onto
* activeRoomStore. Once React context matures a bit (somewhere between
* 0.14 and 1.0, apparently):
*
* https://facebook.github.io/react/blog/2015/02/24/streamlining-react-elements.html#solution-make-context-parent-based-instead-of-owner-based
*
* we should be able to use those to clean this up.
*
* @param {String} queryString The media query to evaluate.
* @returns {MediaQueryList|null}
* @private
*/
_matchMedia: function(queryString) {
if ("matchMedia" in this.state) {
return this.state.matchMedia(queryString);
} else if ("matchMedia" in window) {
return window.matchMedia(queryString);
}
return null;
},
/**
* Toggles streaming status for a given stream type.
*
@ -458,7 +487,7 @@ loop.standaloneRoomViews = (function(mozL10n) {
var targetWidth;
node.style.right = "auto";
if (window.matchMedia && window.matchMedia("screen and (max-width:640px)").matches) {
if (this._matchMedia("screen and (max-width:640px)").matches) {
// For reduced screen widths, we just go for a fixed size and no overlap.
targetWidth = 180;
node.style.width = (targetWidth * ratio.width) + "px";
@ -470,8 +499,25 @@ loop.standaloneRoomViews = (function(mozL10n) {
// Now position the local camera view correctly with respect to the remote
// video stream or the screen share stream.
var remoteVideoDimensions = this.getRemoteVideoDimensions(
this.state.receivingScreenShare ? "screen" : "camera");
var remoteVideoDimensions;
var isScreenShare = this.state.receivingScreenShare;
var videoDisplayed = isScreenShare ?
this.state.screenShareVideoObject || this.props.screenSharePosterUrl :
this.state.remoteSrcVideoObject || this.props.remotePosterUrl;
if ((isScreenShare || this.shouldRenderRemoteVideo()) && videoDisplayed) {
remoteVideoDimensions = this.getRemoteVideoDimensions(
isScreenShare ? "screen" : "camera");
} else {
var remoteElement = this.getDOMNode().querySelector(".remote.focus-stream");
if (!remoteElement) {
return;
}
remoteVideoDimensions = {
streamWidth: remoteElement.offsetWidth,
offsetX: remoteElement.offsetLeft
};
}
targetWidth = remoteVideoDimensions.streamWidth * LOCAL_STREAM_SIZE;
@ -515,7 +561,7 @@ loop.standaloneRoomViews = (function(mozL10n) {
}
// XXX For the time being, if we're a narrow screen, aka mobile, we don't display
// the remote media (bug 1133534).
if (window.matchMedia && window.matchMedia("screen and (max-width:640px)").matches) {
if (this._matchMedia("screen and (max-width:640px)").matches) {
return;
}
@ -557,9 +603,51 @@ loop.standaloneRoomViews = (function(mozL10n) {
this.state.roomState === ROOM_STATES.HAS_PARTICIPANTS;
},
/**
* Works out if remote video should be rendered or not, depending on the
* room state and other flags.
*
* @return {Boolean} True if remote video should be rendered.
*/
shouldRenderRemoteVideo: function() {
switch(this.state.roomState) {
case ROOM_STATES.HAS_PARTICIPANTS:
if (this.state.remoteVideoEnabled) {
return true;
}
if (this.state.mediaConnected) {
// Remote video hasn't been enabled yet, so if the media is
// connected we should be displaying an avatar instead.
return false;
}
return true;
case ROOM_STATES.READY:
case ROOM_STATES.INIT:
case ROOM_STATES.JOINING:
case ROOM_STATES.SESSION_CONNECTED:
case ROOM_STATES.JOINED:
case ROOM_STATES.MEDIA_WAIT:
// In these states we don't show an avatar while waiting for
// the other party to connect.
return true;
case ROOM_STATES.CLOSING:
// the other person has shown up, so we don't want to show an avatar
return true;
default:
console.warn("StandaloneRoomView.shouldRenderRemoteVideo:" +
" unexpected roomState: ", this.state.roomState);
return true;
}
},
render: function() {
var localStreamClasses = React.addons.classSet({
hide: !this._roomIsActive(),
local: true,
"local-stream": true,
"local-stream-audio": this.state.videoMuted
@ -602,10 +690,25 @@ loop.standaloneRoomViews = (function(mozL10n) {
mozL10n.get("self_view_hidden_message")
),
React.createElement("div", {className: "video_wrapper remote_wrapper"},
React.createElement("div", {className: remoteStreamClasses}),
React.createElement("div", {className: screenShareStreamClasses})
React.createElement("div", {className: remoteStreamClasses},
React.createElement(sharedViews.MediaView, {displayAvatar: !this.shouldRenderRemoteVideo(),
posterUrl: this.props.remotePosterUrl,
mediaType: "remote",
srcVideoObject: this.state.remoteSrcVideoObject})
),
React.createElement("div", {className: screenShareStreamClasses},
React.createElement(sharedViews.MediaView, {displayAvatar: false,
posterUrl: this.props.screenSharePosterUrl,
mediaType: "screen-share",
srcVideoObject: this.state.screenShareVideoObject})
)
),
React.createElement("div", {className: localStreamClasses})
React.createElement("div", {className: localStreamClasses},
React.createElement(sharedViews.MediaView, {displayAvatar: this.state.videoMuted,
posterUrl: this.props.localPosterUrl,
mediaType: "local",
srcVideoObject: this.state.localSrcVideoObject})
)
),
React.createElement(sharedViews.ConversationToolbar, {
dispatcher: this.props.dispatcher,

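One thing to note about the _matchMedia helper added above: it can return null when neither the store state nor window provides matchMedia, while the call sites read .matches directly. A null-safe usage sketch (illustrative only; the helper name matches the patch, the wrapper function is hypothetical):

  // Sketch only. Guards the MediaQueryList before reading .matches; the
  // patch's call sites assume matchMedia is always available.
  function isNarrowScreen(view) {
    var mql = view._matchMedia("screen and (max-width:640px)");
    return !!(mql && mql.matches);
  }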
View File

@ -337,7 +337,11 @@ loop.standaloneRoomViews = (function(mozL10n) {
React.PropTypes.instanceOf(loop.store.FxOSActiveRoomStore)
]).isRequired,
dispatcher: React.PropTypes.instanceOf(loop.Dispatcher).isRequired,
isFirefox: React.PropTypes.bool.isRequired
isFirefox: React.PropTypes.bool.isRequired,
// The poster URLs are for UI-showcase testing and development
localPosterUrl: React.PropTypes.string,
remotePosterUrl: React.PropTypes.string,
screenSharePosterUrl: React.PropTypes.string
},
getInitialState: function() {
@ -385,10 +389,7 @@ loop.standaloneRoomViews = (function(mozL10n) {
if (this.state.roomState !== ROOM_STATES.MEDIA_WAIT &&
nextState.roomState === ROOM_STATES.MEDIA_WAIT) {
this.props.dispatcher.dispatch(new sharedActions.SetupStreamElements({
publisherConfig: this.getDefaultPublisherConfig({publishVideo: true}),
getLocalElementFunc: this._getElement.bind(this, ".local"),
getRemoteElementFunc: this._getElement.bind(this, ".remote"),
getScreenShareElementFunc: this._getElement.bind(this, ".screen")
publisherConfig: this.getDefaultPublisherConfig({publishVideo: true})
}));
}
@ -411,8 +412,10 @@ loop.standaloneRoomViews = (function(mozL10n) {
// Remove the custom screenshare styles on the remote camera.
var node = this._getElement(".remote");
node.removeAttribute("style");
}
// Force the video sizes to update.
if (this.state.receivingScreenShare != nextState.receivingScreenShare ||
this.state.remoteVideoEnabled != nextState.remoteVideoEnabled) {
this.updateVideoContainer();
}
},
@ -425,6 +428,32 @@ loop.standaloneRoomViews = (function(mozL10n) {
this.props.dispatcher.dispatch(new sharedActions.LeaveRoom());
},
/**
* Wrapper for window.matchMedia so that we use an appropriate version
* for the ui-showcase, which puts views inside of their own iframes.
*
* Currently, we use an icky hack, and the showcase conspires with
* react-frame-component to set iframe.contentWindow.matchMedia onto
* activeRoomStore. Once React context matures a bit (somewhere between
* 0.14 and 1.0, apparently):
*
* https://facebook.github.io/react/blog/2015/02/24/streamlining-react-elements.html#solution-make-context-parent-based-instead-of-owner-based
*
* we should be able to use those to clean this up.
*
* @param {String} queryString The media query to evaluate.
* @returns {MediaQueryList|null}
* @private
*/
_matchMedia: function(queryString) {
if ("matchMedia" in this.state) {
return this.state.matchMedia(queryString);
} else if ("matchMedia" in window) {
return window.matchMedia(queryString);
}
return null;
},
/**
* Toggles streaming status for a given stream type.
*
@ -458,7 +487,7 @@ loop.standaloneRoomViews = (function(mozL10n) {
var targetWidth;
node.style.right = "auto";
if (window.matchMedia && window.matchMedia("screen and (max-width:640px)").matches) {
if (this._matchMedia("screen and (max-width:640px)").matches) {
// For reduced screen widths, we just go for a fixed size and no overlap.
targetWidth = 180;
node.style.width = (targetWidth * ratio.width) + "px";
@ -470,8 +499,25 @@ loop.standaloneRoomViews = (function(mozL10n) {
// Now position the local camera view correctly with respect to the remote
// video stream or the screen share stream.
var remoteVideoDimensions = this.getRemoteVideoDimensions(
this.state.receivingScreenShare ? "screen" : "camera");
var remoteVideoDimensions;
var isScreenShare = this.state.receivingScreenShare;
var videoDisplayed = isScreenShare ?
this.state.screenShareVideoObject || this.props.screenSharePosterUrl :
this.state.remoteSrcVideoObject || this.props.remotePosterUrl;
if ((isScreenShare || this.shouldRenderRemoteVideo()) && videoDisplayed) {
remoteVideoDimensions = this.getRemoteVideoDimensions(
isScreenShare ? "screen" : "camera");
} else {
var remoteElement = this.getDOMNode().querySelector(".remote.focus-stream");
if (!remoteElement) {
return;
}
remoteVideoDimensions = {
streamWidth: remoteElement.offsetWidth,
offsetX: remoteElement.offsetLeft
};
}
targetWidth = remoteVideoDimensions.streamWidth * LOCAL_STREAM_SIZE;
@ -515,7 +561,7 @@ loop.standaloneRoomViews = (function(mozL10n) {
}
// XXX For the time being, if we're a narrow screen, aka mobile, we don't display
// the remote media (bug 1133534).
if (window.matchMedia && window.matchMedia("screen and (max-width:640px)").matches) {
if (this._matchMedia("screen and (max-width:640px)").matches) {
return;
}
@ -557,9 +603,51 @@ loop.standaloneRoomViews = (function(mozL10n) {
this.state.roomState === ROOM_STATES.HAS_PARTICIPANTS;
},
/**
* Works out if remote video should be rendered or not, depending on the
* room state and other flags.
*
* @return {Boolean} True if remote video should be rendered.
*/
shouldRenderRemoteVideo: function() {
switch(this.state.roomState) {
case ROOM_STATES.HAS_PARTICIPANTS:
if (this.state.remoteVideoEnabled) {
return true;
}
if (this.state.mediaConnected) {
// Remote video hasn't been enabled yet, so if the media is
// connected we should be displaying an avatar instead.
return false;
}
return true;
case ROOM_STATES.READY:
case ROOM_STATES.INIT:
case ROOM_STATES.JOINING:
case ROOM_STATES.SESSION_CONNECTED:
case ROOM_STATES.JOINED:
case ROOM_STATES.MEDIA_WAIT:
// In these states we don't show an avatar while waiting for
// the other party to connect.
return true;
case ROOM_STATES.CLOSING:
// the other person has shown up, so we don't want to show an avatar
return true;
default:
console.warn("StandaloneRoomView.shouldRenderRemoteVideo:" +
" unexpected roomState: ", this.state.roomState);
return true;
}
},
render: function() {
var localStreamClasses = React.addons.classSet({
hide: !this._roomIsActive(),
local: true,
"local-stream": true,
"local-stream-audio": this.state.videoMuted
@ -602,10 +690,25 @@ loop.standaloneRoomViews = (function(mozL10n) {
{mozL10n.get("self_view_hidden_message")}
</span>
<div className="video_wrapper remote_wrapper">
<div className={remoteStreamClasses}></div>
<div className={screenShareStreamClasses}></div>
<div className={remoteStreamClasses}>
<sharedViews.MediaView displayAvatar={!this.shouldRenderRemoteVideo()}
posterUrl={this.props.remotePosterUrl}
mediaType="remote"
srcVideoObject={this.state.remoteSrcVideoObject} />
</div>
<div className={screenShareStreamClasses}>
<sharedViews.MediaView displayAvatar={false}
posterUrl={this.props.screenSharePosterUrl}
mediaType="screen-share"
srcVideoObject={this.state.screenShareVideoObject} />
</div>
</div>
<div className={localStreamClasses}>
<sharedViews.MediaView displayAvatar={this.state.videoMuted}
posterUrl={this.props.localPosterUrl}
mediaType="local"
srcVideoObject={this.state.localSrcVideoObject} />
</div>
<div className={localStreamClasses}></div>
</div>
<sharedViews.ConversationToolbar
dispatcher={this.props.dispatcher}

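Read together with MediaView, the shouldRenderRemoteVideo switch above means the remote avatar appears in exactly one situation. A small sketch of that condition, derived directly from the switch (the helper name is illustrative, not part of the patch):

  // Derived from StandaloneRoomView.shouldRenderRemoteVideo: displayAvatar is
  // !shouldRenderRemoteVideo(), which is true only when the room has
  // participants, media is connected, and remote video is not enabled.
  function remoteShowsAvatar(state, ROOM_STATES) {
    return state.roomState === ROOM_STATES.HAS_PARTICIPANTS &&
           state.mediaConnected &&
           !state.remoteVideoEnabled;
  }

  // e.g. an audio-only peer in a joined room:
  // remoteShowsAvatar({
  //   roomState: ROOM_STATES.HAS_PARTICIPANTS,
  //   mediaConnected: true,
  //   remoteVideoEnabled: false
  // }, ROOM_STATES); // => true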
View File

@ -8,8 +8,8 @@ describe("loop.conversationViews", function () {
var TestUtils = React.addons.TestUtils;
var sharedActions = loop.shared.actions;
var sharedUtils = loop.shared.utils;
var sharedView = loop.shared.views;
var sandbox, view, dispatcher, contact, fakeAudioXHR;
var sharedViews = loop.shared.views;
var sandbox, view, dispatcher, contact, fakeAudioXHR, conversationStore;
var fakeMozLoop, fakeWindow;
var CALL_STATES = loop.store.CALL_STATES;
@ -104,6 +104,19 @@ describe("loop.conversationViews", function () {
};
loop.shared.mixins.setRootObject(fakeWindow);
var feedbackStore = new loop.store.FeedbackStore(dispatcher, {
feedbackClient: {}
});
conversationStore = new loop.store.ConversationStore(dispatcher, {
client: {},
mozLoop: fakeMozLoop,
sdkDriver: {}
});
loop.store.StoreMixin.register({
conversationStore: conversationStore,
feedbackStore: feedbackStore
});
});
afterEach(function() {
@ -255,7 +268,7 @@ describe("loop.conversationViews", function () {
});
describe("CallFailedView", function() {
var store, fakeAudio;
var fakeAudio;
var contact = {email: [{value: "test@test.tld"}]};
@ -269,15 +282,6 @@ describe("loop.conversationViews", function () {
}
beforeEach(function() {
store = new loop.store.ConversationStore(dispatcher, {
client: {},
mozLoop: navigator.mozLoop,
sdkDriver: {}
});
loop.store.StoreMixin.register({
conversationStore: store
});
fakeAudio = {
play: sinon.spy(),
pause: sinon.spy(),
@ -357,7 +361,7 @@ describe("loop.conversationViews", function () {
it("should compose an email once the email link is received", function() {
var composeCallUrlEmail = sandbox.stub(sharedUtils, "composeCallUrlEmail");
view = mountTestComponent({contact: contact});
store.setStoreState({emailLink: "http://fake.invalid/"});
conversationStore.setStoreState({emailLink: "http://fake.invalid/"});
sinon.assert.calledOnce(composeCallUrlEmail);
sinon.assert.calledWithExactly(composeCallUrlEmail,
@ -368,7 +372,7 @@ describe("loop.conversationViews", function () {
function() {
view = mountTestComponent({contact: contact});
store.setStoreState({emailLink: "http://fake.invalid/"});
conversationStore.setStoreState({emailLink: "http://fake.invalid/"});
sinon.assert.calledOnce(fakeWindow.close);
});
@ -377,7 +381,7 @@ describe("loop.conversationViews", function () {
function() {
view = mountTestComponent({contact: contact});
store.trigger("error:emailLink");
conversationStore.trigger("error:emailLink");
expect(view.getDOMNode().querySelector(".error")).not.eql(null);
});
@ -386,7 +390,7 @@ describe("loop.conversationViews", function () {
function() {
view = mountTestComponent({contact: contact});
store.trigger("error:emailLink");
conversationStore.trigger("error:emailLink");
expect(view.getDOMNode().querySelector(".btn-email").disabled).eql(false);
});
@ -403,7 +407,7 @@ describe("loop.conversationViews", function () {
it("should show 'something went wrong' when the reason is WEBSOCKET_REASONS.MEDIA_FAIL",
function () {
store.setStoreState({callStateReason: WEBSOCKET_REASONS.MEDIA_FAIL});
conversationStore.setStoreState({callStateReason: WEBSOCKET_REASONS.MEDIA_FAIL});
view = mountTestComponent({contact: contact});
@ -412,7 +416,7 @@ describe("loop.conversationViews", function () {
it("should show 'contact unavailable' when the reason is WEBSOCKET_REASONS.REJECT",
function () {
store.setStoreState({callStateReason: WEBSOCKET_REASONS.REJECT});
conversationStore.setStoreState({callStateReason: WEBSOCKET_REASONS.REJECT});
view = mountTestComponent({contact: contact});
@ -423,7 +427,7 @@ describe("loop.conversationViews", function () {
it("should show 'contact unavailable' when the reason is WEBSOCKET_REASONS.BUSY",
function () {
store.setStoreState({callStateReason: WEBSOCKET_REASONS.BUSY});
conversationStore.setStoreState({callStateReason: WEBSOCKET_REASONS.BUSY});
view = mountTestComponent({contact: contact});
@ -434,7 +438,7 @@ describe("loop.conversationViews", function () {
it("should show 'something went wrong' when the reason is 'setup'",
function () {
store.setStoreState({callStateReason: "setup"});
conversationStore.setStoreState({callStateReason: "setup"});
view = mountTestComponent({contact: contact});
@ -444,7 +448,7 @@ describe("loop.conversationViews", function () {
it("should show 'contact unavailable' when the reason is REST_ERRNOS.USER_UNAVAILABLE",
function () {
store.setStoreState({callStateReason: REST_ERRNOS.USER_UNAVAILABLE});
conversationStore.setStoreState({callStateReason: REST_ERRNOS.USER_UNAVAILABLE});
view = mountTestComponent({contact: contact});
@ -455,7 +459,7 @@ describe("loop.conversationViews", function () {
it("should show 'no media' when the reason is FAILURE_DETAILS.UNABLE_TO_PUBLISH_MEDIA",
function () {
store.setStoreState({callStateReason: FAILURE_DETAILS.UNABLE_TO_PUBLISH_MEDIA});
conversationStore.setStoreState({callStateReason: FAILURE_DETAILS.UNABLE_TO_PUBLISH_MEDIA});
view = mountTestComponent({contact: contact});
@ -464,7 +468,7 @@ describe("loop.conversationViews", function () {
it("should display a generic contact unavailable msg when the reason is" +
" WEBSOCKET_REASONS.BUSY and no display name is available", function() {
store.setStoreState({callStateReason: WEBSOCKET_REASONS.BUSY});
conversationStore.setStoreState({callStateReason: WEBSOCKET_REASONS.BUSY});
var phoneOnlyContact = {
tel: [{"pref": true, type: "work", value: ""}]
};
@ -477,27 +481,72 @@ describe("loop.conversationViews", function () {
});
describe("OngoingConversationView", function() {
function mountTestComponent(props) {
function mountTestComponent(extraProps) {
var props = _.extend({
dispatcher: dispatcher
}, extraProps);
return TestUtils.renderIntoDocument(
React.createElement(loop.conversationViews.OngoingConversationView, props));
}
it("should dispatch a setupStreamElements action when the view is created",
function() {
view = mountTestComponent({
dispatcher: dispatcher
});
view = mountTestComponent();
sinon.assert.calledOnce(dispatcher.dispatch);
sinon.assert.calledWithMatch(dispatcher.dispatch,
sinon.match.hasOwn("name", "setupStreamElements"));
});
it("should display an avatar for remote video when the stream is not enabled", function() {
view = mountTestComponent({
mediaConnected: true,
remoteVideoEnabled: false
});
TestUtils.findRenderedComponentWithType(view, sharedViews.AvatarView);
});
it("should display the remote video when the stream is enabled", function() {
conversationStore.setStoreState({
remoteSrcVideoObject: { fake: 1 }
});
view = mountTestComponent({
mediaConnected: true,
remoteVideoEnabled: true
});
expect(view.getDOMNode().querySelector(".remote video")).not.eql(null);
});
it("should display an avatar for local video when the stream is not enabled", function() {
view = mountTestComponent({
video: {
enabled: false
}
});
TestUtils.findRenderedComponentWithType(view, sharedViews.AvatarView);
});
it("should display the local video when the stream is enabled", function() {
conversationStore.setStoreState({
localSrcVideoObject: { fake: 1 }
});
view = mountTestComponent({
video: {
enabled: true
}
});
expect(view.getDOMNode().querySelector(".local video")).not.eql(null);
});
it("should dispatch a hangupCall action when the hangup button is pressed",
function() {
view = mountTestComponent({
dispatcher: dispatcher
});
view = mountTestComponent();
var hangupBtn = view.getDOMNode().querySelector(".btn-hangup");
@ -510,7 +559,6 @@ describe("loop.conversationViews", function () {
it("should dispatch a setMute action when the audio mute button is pressed",
function() {
view = mountTestComponent({
dispatcher: dispatcher,
audio: {enabled: false}
});
@ -529,7 +577,6 @@ describe("loop.conversationViews", function () {
it("should dispatch a setMute action when the video mute button is pressed",
function() {
view = mountTestComponent({
dispatcher: dispatcher,
video: {enabled: true}
});
@ -547,7 +594,6 @@ describe("loop.conversationViews", function () {
it("should set the mute button as mute off", function() {
view = mountTestComponent({
dispatcher: dispatcher,
video: {enabled: true}
});
@ -558,7 +604,6 @@ describe("loop.conversationViews", function () {
it("should set the mute button as mute on", function() {
view = mountTestComponent({
dispatcher: dispatcher,
audio: {enabled: false}
});
@ -569,7 +614,7 @@ describe("loop.conversationViews", function () {
});
describe("CallControllerView", function() {
var store, feedbackStore;
var feedbackStore;
function mountTestComponent() {
return TestUtils.renderIntoDocument(
@ -580,22 +625,13 @@ describe("loop.conversationViews", function () {
}
beforeEach(function() {
store = new loop.store.ConversationStore(dispatcher, {
client: {},
mozLoop: fakeMozLoop,
sdkDriver: {}
});
loop.store.StoreMixin.register({
conversationStore: store
});
feedbackStore = new loop.store.FeedbackStore(dispatcher, {
feedbackClient: {}
});
});
it("should set the document title to the callerId", function() {
store.setStoreState({
conversationStore.setStoreState({
contact: contact
});
@ -606,7 +642,7 @@ describe("loop.conversationViews", function () {
it("should fallback to the contact email if the contact name is not defined", function() {
delete contact.name;
store.setStoreState({
conversationStore.setStoreState({
contact: contact
});
@ -616,7 +652,7 @@ describe("loop.conversationViews", function () {
});
it("should fallback to the caller id if no contact is defined", function() {
store.setStoreState({
conversationStore.setStoreState({
callerId: "fakeId"
});
@ -627,7 +663,7 @@ describe("loop.conversationViews", function () {
it("should render the CallFailedView when the call state is 'terminated'",
function() {
store.setStoreState({
conversationStore.setStoreState({
callState: CALL_STATES.TERMINATED,
contact: contact
});
@ -640,7 +676,7 @@ describe("loop.conversationViews", function () {
it("should render the PendingConversationView for outgoing calls when the call state is 'gather'",
function() {
store.setStoreState({
conversationStore.setStoreState({
callState: CALL_STATES.GATHER,
contact: contact,
outgoing: true
@ -653,7 +689,7 @@ describe("loop.conversationViews", function () {
});
it("should render the AcceptCallView for incoming calls when the call state is 'alerting'", function() {
store.setStoreState({
conversationStore.setStoreState({
callState: CALL_STATES.ALERTING,
outgoing: false
});
@ -666,7 +702,7 @@ describe("loop.conversationViews", function () {
it("should render the OngoingConversationView when the call state is 'ongoing'",
function() {
store.setStoreState({callState: CALL_STATES.ONGOING});
conversationStore.setStoreState({callState: CALL_STATES.ONGOING});
view = mountTestComponent();
@ -676,7 +712,7 @@ describe("loop.conversationViews", function () {
it("should render the FeedbackView when the call state is 'finished'",
function() {
store.setStoreState({callState: CALL_STATES.FINISHED});
conversationStore.setStoreState({callState: CALL_STATES.FINISHED});
view = mountTestComponent();
@ -685,7 +721,7 @@ describe("loop.conversationViews", function () {
});
it("should set the document title to conversation_has_ended when displaying the feedback view", function() {
store.setStoreState({callState: CALL_STATES.FINISHED});
conversationStore.setStoreState({callState: CALL_STATES.FINISHED});
mountTestComponent();
@ -701,7 +737,7 @@ describe("loop.conversationViews", function () {
};
sandbox.stub(window, "Audio").returns(fakeAudio);
store.setStoreState({callState: CALL_STATES.FINISHED});
conversationStore.setStoreState({callState: CALL_STATES.FINISHED});
view = mountTestComponent();
@ -710,7 +746,7 @@ describe("loop.conversationViews", function () {
it("should update the rendered views when the state is changed.",
function() {
store.setStoreState({
conversationStore.setStoreState({
callState: CALL_STATES.GATHER,
contact: contact,
outgoing: true
@ -721,7 +757,7 @@ describe("loop.conversationViews", function () {
TestUtils.findRenderedComponentWithType(view,
loop.conversationViews.PendingConversationView);
store.setStoreState({callState: CALL_STATES.TERMINATED});
conversationStore.setStoreState({callState: CALL_STATES.TERMINATED});
TestUtils.findRenderedComponentWithType(view,
loop.conversationViews.CallFailedView);

View File

@ -8,6 +8,7 @@ describe("loop.roomViews", function () {
var TestUtils = React.addons.TestUtils;
var sharedActions = loop.shared.actions;
var sharedUtils = loop.shared.utils;
var sharedViews = loop.shared.views;
var ROOM_STATES = loop.store.ROOM_STATES;
var SCREEN_SHARE_STATES = loop.shared.utils.SCREEN_SHARE_STATES;
@ -67,6 +68,13 @@ describe("loop.roomViews", function () {
mozLoop: fakeMozLoop,
activeRoomStore: activeRoomStore
});
var textChatStore = new loop.store.TextChatStore(dispatcher, {
sdkDriver: {}
});
loop.store.StoreMixin.register({
textChatStore: textChatStore
});
fakeContextURL = {
description: "An invalid page",
@ -422,16 +430,6 @@ describe("loop.roomViews", function () {
sinon.assert.calledOnce(dispatcher.dispatch);
sinon.assert.calledWithExactly(dispatcher.dispatch,
sinon.match.instanceOf(sharedActions.SetupStreamElements));
sinon.assert.calledWithExactly(dispatcher.dispatch,
sinon.match(function(value) {
return value.getLocalElementFunc() ===
view.getDOMNode().querySelector(".local");
}));
sinon.assert.calledWithExactly(dispatcher.dispatch,
sinon.match(function(value) {
return value.getRemoteElementFunc() ===
view.getDOMNode().querySelector(".remote");
}));
}
it("should dispatch a `SetupStreamElements` action when the MEDIA_WAIT state " +
@ -516,6 +514,54 @@ describe("loop.roomViews", function () {
TestUtils.findRenderedComponentWithType(view,
loop.shared.views.FeedbackView);
});
it("should display an avatar for remote video when the room has participants but video is not enabled",
function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
mediaConnected: true,
remoteVideoEnabled: false
});
view = mountTestComponent();
TestUtils.findRenderedComponentWithType(view, sharedViews.AvatarView);
});
it("should display the remote video when there are participants and video is enabled", function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
mediaConnected: true,
remoteVideoEnabled: true,
remoteSrcVideoObject: { fake: 1 }
});
view = mountTestComponent();
expect(view.getDOMNode().querySelector(".remote video")).not.eql(null);
});
it("should display an avatar for local video when the stream is muted", function() {
activeRoomStore.setStoreState({
videoMuted: true
});
view = mountTestComponent();
TestUtils.findRenderedComponentWithType(view, sharedViews.AvatarView);
});
it("should display the local video when the stream is enabled", function() {
activeRoomStore.setStoreState({
localSrcVideoObject: { fake: 1 },
videoMuted: false
});
view = mountTestComponent();
expect(view.getDOMNode().querySelector(".local video")).not.eql(null);
});
});
describe("Mute", function() {

View File

@ -102,7 +102,7 @@ class Test1BrowserCall(MarionetteTestCase):
media_container = self.wait_for_element_displayed(By.CLASS_NAME, "media")
self.assertEqual(media_container.tag_name, "div", "expect a video container")
self.check_video(".local .OT_publisher .OT_widget-container");
self.check_video(".local-video")
def local_get_and_verify_room_url(self):
self.switch_to_chatbox()
@ -127,23 +127,20 @@ class Test1BrowserCall(MarionetteTestCase):
"btn-join")
join_button.click()
# Assumes the standlone or the conversation window is selected first.
# Assumes the standalone or the conversation window is selected first.
def check_video(self, selector):
video_wrapper = self.wait_for_element_displayed(By.CSS_SELECTOR,
video = self.wait_for_element_displayed(By.CSS_SELECTOR,
selector, 20)
video = self.wait_for_subelement_displayed(video_wrapper,
By.TAG_NAME, "video")
self.wait_for_element_attribute_to_be_false(video, "paused")
self.assertEqual(video.get_attribute("ended"), "false")
def standalone_check_remote_video(self):
self.switch_to_standalone()
self.check_video(".remote .OT_subscriber .OT_widget-container")
self.check_video(".remote-video")
def local_check_remote_video(self):
self.switch_to_chatbox()
self.check_video(".remote .OT_subscriber .OT_widget-container")
self.check_video(".remote-video")
def local_enable_screenshare(self):
self.switch_to_chatbox()
@ -153,7 +150,7 @@ class Test1BrowserCall(MarionetteTestCase):
def standalone_check_remote_screenshare(self):
self.switch_to_standalone()
self.check_video(".media .screen .OT_subscriber .OT_widget-container")
self.check_video(".screen-share-video")
def remote_leave_room_and_verify_feedback(self):
self.switch_to_standalone()

View File

@ -971,6 +971,61 @@ describe("loop.store.ActiveRoomStore", function () {
});
});
describe("#localVideoEnabled", function() {
it("should add a localSrcVideoObject to the store", function() {
var fakeVideoElement = {name: "fakeVideoElement"};
expect(store.getStoreState()).to.not.have.property("localSrcVideoObject");
store.localVideoEnabled({srcVideoObject: fakeVideoElement});
expect(store.getStoreState()).to.have.property("localSrcVideoObject",
fakeVideoElement);
});
});
describe("#remoteVideoEnabled", function() {
var fakeVideoElement;
beforeEach(function() {
fakeVideoElement = {name: "fakeVideoElement"};
});
it("should add a remoteSrcVideoObject to the store", function() {
expect(store.getStoreState()).to.not.have.property("remoteSrcVideoObject");
store.remoteVideoEnabled({srcVideoObject: fakeVideoElement});
expect(store.getStoreState()).to.have.property("remoteSrcVideoObject",
fakeVideoElement);
});
it("should set remoteVideoEnabled", function() {
store.remoteVideoEnabled({srcVideoObject: fakeVideoElement});
expect(store.getStoreState().remoteVideoEnabled).eql(true);
});
});
describe("#remoteVideoDisabled", function() {
it("should set remoteVideoEnabled to false", function() {
store.setStoreState({
remoteVideoEnabled: true
});
store.remoteVideoDisabled();
expect(store.getStoreState().remoteVideoEnabled).eql(false);
});
});
describe("#mediaConnected", function() {
it("should set mediaConnected to true", function() {
store.mediaConnected();
expect(store.getStoreState().mediaConnected).eql(true);
});
});
describe("#screenSharingState", function() {
beforeEach(function() {
store.setStoreState({windowId: "1234"});
@ -1012,6 +1067,34 @@ describe("loop.store.ActiveRoomStore", function () {
expect(store.getStoreState().receivingScreenShare).eql(true);
});
it("should add a screenShareVideoObject to the store when sharing is active", function() {
var fakeVideoElement = {name: "fakeVideoElement"};
expect(store.getStoreState()).to.not.have.property("screenShareVideoObject");
store.receivingScreenShare(new sharedActions.ReceivingScreenShare({
receiving: true,
srcVideoObject: fakeVideoElement
}));
expect(store.getStoreState()).to.have.property("screenShareVideoObject",
fakeVideoElement);
});
it("should clear the screenShareVideoObject from the store when sharing is inactive", function() {
store.setStoreState({
screenShareVideoObject: {
name: "fakeVideoElement"
}
});
store.receivingScreenShare(new sharedActions.ReceivingScreenShare({
receiving: false,
srcVideoObject: null
}));
expect(store.getStoreState().screenShareVideoObject).eql(null);
});
it("should delete the screen remote video dimensions if screen sharing is not active", function() {
store.setStoreState({
remoteVideoDimensions: {
@ -1162,6 +1245,16 @@ describe("loop.store.ActiveRoomStore", function () {
expect(store.getStoreState().roomState).eql(ROOM_STATES.SESSION_CONNECTED);
});
it("should clear the remoteSrcVideoObject", function() {
store.setStoreState({
remoteSrcVideoObject: { name: "fakeVideoElement" }
});
store.remotePeerDisconnected();
expect(store.getStoreState().remoteSrcVideoObject).eql(null);
});
});
describe("#connectionStatus", function() {

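The new ActiveRoomStore assertions above pin down how the video-related actions mutate store state. A minimal sketch of handlers consistent with those assertions (not the store's actual implementation, which may do more):

  // Sketch only: the state transitions the tests above assert.
  var videoActionHandlers = {
    localVideoEnabled: function(actionData) {
      this.setStoreState({ localSrcVideoObject: actionData.srcVideoObject });
    },
    remoteVideoEnabled: function(actionData) {
      this.setStoreState({
        remoteVideoEnabled: true,
        remoteSrcVideoObject: actionData.srcVideoObject
      });
    },
    remoteVideoDisabled: function() {
      this.setStoreState({ remoteVideoEnabled: false });
    },
    mediaConnected: function() {
      this.setStoreState({ mediaConnected: true });
    }
  };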
View File

@ -13,7 +13,7 @@ describe("loop.store.ConversationStore", function () {
var sharedActions = loop.shared.actions;
var sharedUtils = loop.shared.utils;
var sandbox, dispatcher, client, store, fakeSessionData, sdkDriver;
var contact, fakeMozLoop;
var contact, fakeMozLoop, fakeVideoElement;
var connectPromise, resolveConnectPromise, rejectConnectPromise;
var wsCancelSpy, wsCloseSpy, wsDeclineSpy, wsMediaUpSpy, fakeWebsocket;
@ -89,6 +89,8 @@ describe("loop.store.ConversationStore", function () {
progressURL: "fakeURL"
};
fakeVideoElement = { id: "fakeVideoElement" };
var dummySocket = {
close: sinon.spy(),
send: sinon.spy()
@ -927,6 +929,62 @@ describe("loop.store.ConversationStore", function () {
sinon.assert.calledOnce(wsMediaUpSpy);
});
it("should set store.mediaConnected to true", function () {
store._websocket = fakeWebsocket;
store.mediaConnected(new sharedActions.MediaConnected());
expect(store.getStoreState("mediaConnected")).eql(true);
});
});
describe("#localVideoEnabled", function() {
it("should set store.localSrcVideoObject from the action data", function () {
store.localVideoEnabled(
new sharedActions.LocalVideoEnabled({srcVideoObject: fakeVideoElement}));
expect(store.getStoreState("localSrcVideoObject")).eql(fakeVideoElement);
});
});
describe("#remoteVideoEnabled", function() {
it("should set store.remoteSrcVideoObject from the actionData", function () {
store.setStoreState({remoteSrcVideoObject: undefined});
store.remoteVideoEnabled(
new sharedActions.RemoteVideoEnabled({srcVideoObject: fakeVideoElement}));
expect(store.getStoreState("remoteSrcVideoObject")).eql(fakeVideoElement);
});
it("should set store.remoteVideoEnabled to true", function () {
store.setStoreState({remoteVideoEnabled: false});
store.remoteVideoEnabled(
new sharedActions.RemoteVideoEnabled({srcVideoObject: fakeVideoElement}));
expect(store.getStoreState("remoteVideoEnabled")).to.be.true;
});
});
describe("#remoteVideoDisabled", function() {
it("should set store.remoteVideoEnabled to false", function () {
store.setStoreState({remoteVideoEnabled: true});
store.remoteVideoDisabled(new sharedActions.RemoteVideoDisabled({}));
expect(store.getStoreState("remoteVideoEnabled")).to.be.false;
});
it("should set store.remoteSrcVideoObject to undefined", function () {
store.setStoreState({remoteSrcVideoObject: fakeVideoElement});
store.remoteVideoDisabled(new sharedActions.RemoteVideoDisabled({}));
expect(store.getStoreState("remoteSrcVideoObject")).to.be.undefined;
});
});
describe("#setMute", function() {

View File

@ -368,88 +368,6 @@ describe("loop.shared.mixins", function() {
});
describe("Events", function() {
describe("resize", function() {
it("should update the width on the local stream element", function() {
localElement = {
offsetWidth: 100,
offsetHeight: 100,
style: { width: "0%" }
};
rootObject.events.resize();
sandbox.clock.tick(10);
expect(localElement.style.width).eql("100%");
});
it("should update the height on the remote stream element", function() {
remoteElement = {
offsetWidth: 100,
offsetHeight: 100,
style: { height: "0%" }
};
rootObject.events.resize();
sandbox.clock.tick(10);
expect(remoteElement.style.height).eql("100%");
});
it("should update the height on the screen share stream element", function() {
screenShareElement = {
offsetWidth: 100,
offsetHeight: 100,
style: { height: "0%" }
};
rootObject.events.resize();
sandbox.clock.tick(10);
expect(screenShareElement.style.height).eql("100%");
});
});
describe("orientationchange", function() {
it("should update the width on the local stream element", function() {
localElement = {
offsetWidth: 100,
offsetHeight: 100,
style: { width: "0%" }
};
rootObject.events.orientationchange();
sandbox.clock.tick(10);
expect(localElement.style.width).eql("100%");
});
it("should update the height on the remote stream element", function() {
remoteElement = {
offsetWidth: 100,
offsetHeight: 100,
style: { height: "0%" }
};
rootObject.events.orientationchange();
sandbox.clock.tick(10);
expect(remoteElement.style.height).eql("100%");
});
it("should update the height on the screen share stream element", function() {
screenShareElement = {
offsetWidth: 100,
offsetHeight: 100,
style: { height: "0%" }
};
rootObject.events.orientationchange();
sandbox.clock.tick(10);
expect(screenShareElement.style.height).eql("100%");
});
});
describe("Video stream dimensions", function() {
var localVideoDimensions = {

View File

@ -13,15 +13,11 @@ describe("loop.OTSdkDriver", function () {
var sandbox;
var dispatcher, driver, mozLoop, publisher, sdk, session, sessionData, subscriber;
var fakeLocalElement, fakeRemoteElement, fakeScreenElement;
var publisherConfig, fakeEvent;
beforeEach(function() {
sandbox = sinon.sandbox.create();
fakeLocalElement = { fake: 1 };
fakeRemoteElement = { fake: 2 };
fakeScreenElement = { fake: 3 };
fakeEvent = {
preventDefault: sinon.stub()
};
@ -120,8 +116,6 @@ describe("loop.OTSdkDriver", function () {
describe("#setupStreamElements", function() {
it("should call initPublisher", function() {
driver.setupStreamElements(new sharedActions.SetupStreamElements({
getLocalElementFunc: function() { return fakeLocalElement; },
getRemoteElementFunc: function() { return fakeRemoteElement; },
publisherConfig: publisherConfig
}));
@ -132,7 +126,9 @@ describe("loop.OTSdkDriver", function () {
}, publisherConfig);
sinon.assert.calledOnce(sdk.initPublisher);
sinon.assert.calledWith(sdk.initPublisher, fakeLocalElement, expectedConfig);
sinon.assert.calledWith(sdk.initPublisher,
sinon.match.instanceOf(HTMLDivElement),
expectedConfig);
});
});
@ -141,8 +137,6 @@ describe("loop.OTSdkDriver", function () {
sdk.initPublisher.returns(publisher);
driver.setupStreamElements(new sharedActions.SetupStreamElements({
getLocalElementFunc: function() { return fakeLocalElement; },
getRemoteElementFunc: function() { return fakeRemoteElement; },
publisherConfig: publisherConfig
}));
});
@ -169,7 +163,9 @@ describe("loop.OTSdkDriver", function () {
}, publisherConfig);
sinon.assert.calledTwice(sdk.initPublisher);
sinon.assert.calledWith(sdk.initPublisher, fakeLocalElement, expectedConfig);
sinon.assert.calledWith(sdk.initPublisher,
sinon.match.instanceOf(HTMLDivElement),
expectedConfig);
});
});
@ -178,8 +174,6 @@ describe("loop.OTSdkDriver", function () {
sdk.initPublisher.returns(publisher);
driver.setupStreamElements(new sharedActions.SetupStreamElements({
getLocalElementFunc: function() { return fakeLocalElement; },
getRemoteElementFunc: function() { return fakeRemoteElement; },
publisherConfig: publisherConfig
}));
});
@ -206,18 +200,8 @@ describe("loop.OTSdkDriver", function () {
});
describe("#startScreenShare", function() {
var fakeElement;
beforeEach(function() {
sandbox.stub(driver, "_noteSharingState");
fakeElement = {
className: "fakeVideo"
};
driver.getScreenShareElementFunc = function() {
return fakeElement;
};
});
it("should initialize a publisher", function() {
@ -233,7 +217,8 @@ describe("loop.OTSdkDriver", function () {
driver.startScreenShare(options);
sinon.assert.calledOnce(sdk.initPublisher);
sinon.assert.calledWithMatch(sdk.initPublisher, fakeElement, options);
sinon.assert.calledWithMatch(sdk.initPublisher,
sinon.match.instanceOf(HTMLDivElement), options);
});
it("should log a telemetry action", function() {
@ -259,10 +244,6 @@ describe("loop.OTSdkDriver", function () {
scrollWithPage: true
}
};
driver.getScreenShareElementFunc = function() {
return fakeScreenElement;
};
driver.startScreenShare(options);
});
@ -282,8 +263,6 @@ describe("loop.OTSdkDriver", function () {
describe("#endScreenShare", function() {
beforeEach(function() {
driver.getScreenShareElementFunc = function() {};
sandbox.stub(driver, "_noteSharingState");
});
@ -638,14 +617,34 @@ describe("loop.OTSdkDriver", function () {
});
});
describe("Events (general media)", function() {
describe("Events: general media", function() {
var fakeConnection, fakeStream, fakeSubscriberObject,
fakeSdkContainerWithVideo, videoElement;
beforeEach(function() {
fakeConnection = "fakeConnection";
fakeStream = {
hasVideo: true,
videoType: "camera",
videoDimensions: {width: 1, height: 2}
};
fakeSubscriberObject = _.extend({
session: { connection: fakeConnection },
stream: fakeStream
}, Backbone.Events);
// Use a real video element so that these tests correctly reflect
// behavior when run in Firefox or Chrome.
videoElement = document.createElement("video");
fakeSdkContainerWithVideo = {
querySelector: sinon.stub().returns(videoElement)
};
driver.connectSession(sessionData);
driver.setupStreamElements(new sharedActions.SetupStreamElements({
getLocalElementFunc: function() {return fakeLocalElement; },
getScreenShareElementFunc: function() {return fakeScreenElement; },
getRemoteElementFunc: function() {return fakeRemoteElement; },
publisherConfig: publisherConfig
}));
});
@ -760,9 +759,13 @@ describe("loop.OTSdkDriver", function () {
});
describe("streamCreated (publisher/local)", function() {
var fakeStream;
var fakeStream, fakeMockVideo;
beforeEach(function() {
driver._mockPublisherEl = document.createElement("div");
fakeMockVideo = document.createElement("video");
driver._mockPublisherEl.appendChild(fakeMockVideo);
fakeStream = {
hasVideo: true,
videoType: "camera",
@ -782,6 +785,16 @@ describe("loop.OTSdkDriver", function () {
}));
});
it("should dispatch a LocalVideoEnabled action", function() {
publisher.trigger("streamCreated", { stream: fakeStream });
sinon.assert.called(dispatcher.dispatch);
sinon.assert.calledWithExactly(dispatcher.dispatch,
new sharedActions.LocalVideoEnabled({
srcVideoObject: fakeMockVideo
}));
});
it("should dispatch a ConnectionStatus action", function() {
driver._metrics.recvStreams = 1;
driver._metrics.connections = 2;
@ -800,16 +813,7 @@ describe("loop.OTSdkDriver", function () {
});
});
describe("streamCreated (session/remote)", function() {
var fakeStream;
beforeEach(function() {
fakeStream = {
hasVideo: true,
videoType: "camera",
videoDimensions: {width: 1, height: 2}
};
});
describe("streamCreated: session/remote", function() {
it("should dispatch a VideoDimensionsChanged action", function() {
session.trigger("streamCreated", { stream: fakeStream });
@ -843,28 +847,63 @@ describe("loop.OTSdkDriver", function () {
session.trigger("streamCreated", { stream: fakeStream });
sinon.assert.calledOnce(session.subscribe);
sinon.assert.calledWith(session.subscribe,
fakeStream, fakeRemoteElement, publisherConfig);
sinon.assert.calledWithExactly(session.subscribe,
fakeStream, sinon.match.instanceOf(HTMLDivElement), publisherConfig,
sinon.match.func);
});
it("should dispatch RemoteVideoEnabled if the stream has video" +
" after subscribe is complete", function() {
session.subscribe.yieldsOn(driver, null, fakeSubscriberObject,
videoElement).returns(fakeSubscriberObject);
driver.session = session;
fakeStream.connection = fakeConnection;
fakeStream.hasVideo = true;
session.trigger("streamCreated", { stream: fakeStream });
sinon.assert.called(dispatcher.dispatch);
sinon.assert.calledWithExactly(dispatcher.dispatch,
new sharedActions.RemoteVideoEnabled({
srcVideoObject: videoElement
}));
});
it("should not dispatch RemoteVideoEnabled if the stream is audio-only", function() {
session.subscribe.yieldsOn(driver, null, fakeSubscriberObject,
videoElement);
fakeStream.connection = fakeConnection;
fakeStream.hasVideo = false;
session.trigger("streamCreated", { stream: fakeStream });
sinon.assert.called(dispatcher.dispatch);
sinon.assert.neverCalledWith(dispatcher.dispatch,
new sharedActions.RemoteVideoEnabled({
srcVideoObject: videoElement
}));
});
it("should trigger a readyForDataChannel signal after subscribe is complete", function() {
session.subscribe.callsArgWith(3, null);
session.subscribe.yieldsOn(driver, null, fakeSubscriberObject,
document.createElement("video"));
driver._useDataChannels = true;
fakeStream.connection = "fakeID";
fakeStream.connection = fakeConnection;
session.trigger("streamCreated", { stream: fakeStream });
sinon.assert.calledOnce(session.signal);
sinon.assert.calledWith(session.signal, {
type: "readyForDataChannel",
to: "fakeID"
to: fakeConnection
});
});
it("should not trigger readyForDataChannel signal if data channels are not wanted", function() {
session.subscribe.callsArgWith(3, null);
session.subscribe.yieldsOn(driver, null, fakeSubscriberObject,
document.createElement("video"));
driver._useDataChannels = false;
fakeStream.connection = "fakeID";
fakeStream.connection = fakeConnection;
session.trigger("streamCreated", { stream: fakeStream });
@ -878,10 +917,13 @@ describe("loop.OTSdkDriver", function () {
sinon.assert.calledOnce(session.subscribe);
sinon.assert.calledWithExactly(session.subscribe,
fakeStream, fakeScreenElement, publisherConfig);
fakeStream, sinon.match.instanceOf(HTMLDivElement), publisherConfig,
sinon.match.func);
});
it("should dispatch a mediaConnected action if both streams are up", function() {
session.subscribe.yieldsOn(driver, null, fakeSubscriberObject,
videoElement);
driver._publishedLocalStream = true;
session.trigger("streamCreated", { stream: fakeStream });
@ -894,6 +936,8 @@ describe("loop.OTSdkDriver", function () {
it("should store the start time when both streams are up and" +
" driver._sendTwoWayMediaTelemetry is true", function() {
session.subscribe.yieldsOn(driver, null, fakeSubscriberObject,
videoElement);
driver._sendTwoWayMediaTelemetry = true;
driver._publishedLocalStream = true;
var startTime = 1;
@ -906,6 +950,8 @@ describe("loop.OTSdkDriver", function () {
it("should not store the start time when both streams are up and" +
" driver._isDesktop is false", function() {
session.subscribe.yieldsOn(driver, null, fakeSubscriberObject,
videoElement);
driver._isDesktop = false;
driver._publishedLocalStream = true;
var startTime = 73;
@ -936,8 +982,10 @@ describe("loop.OTSdkDriver", function () {
new sharedActions.ReceivingScreenShare({receiving: true}));
});
it("should dispatch a ReceivingScreenShare action for screen sharing streams",
function() {
// XXX See bug 1171933 and the comment in
// OtSdkDriver#_handleRemoteScreenShareCreated
it.skip("should dispatch a ReceivingScreenShare action for screen" +
" sharing streams", function() {
fakeStream.videoType = "screen";
session.trigger("streamCreated", { stream: fakeStream });
@ -949,7 +997,7 @@ describe("loop.OTSdkDriver", function () {
});
});
describe("streamDestroyed (publisher/local)", function() {
describe("streamDestroyed: publisher/local", function() {
it("should dispatch a ConnectionStatus action", function() {
driver._metrics.sendStreams = 1;
driver._metrics.recvStreams = 1;
@ -969,7 +1017,7 @@ describe("loop.OTSdkDriver", function () {
});
});
describe("streamDestroyed (session/remote)", function() {
describe("streamDestroyed: session/remote", function() {
var fakeStream;
beforeEach(function() {
@ -1182,6 +1230,36 @@ describe("loop.OTSdkDriver", function () {
});
});
describe("videoEnabled", function() {
it("should dispatch RemoteVideoEnabled", function() {
session.subscribe.yieldsOn(driver, null, fakeSubscriberObject,
videoElement).returns(fakeSubscriberObject);
session.trigger("streamCreated", {stream: fakeSubscriberObject.stream});
driver._mockSubscribeEl.appendChild(videoElement);
fakeSubscriberObject.trigger("videoEnabled");
sinon.assert.called(dispatcher.dispatch);
sinon.assert.calledWith(dispatcher.dispatch,
new sharedActions.RemoteVideoEnabled({srcVideoObject: videoElement}));
});
});
describe("videoDisabled", function() {
it("should dispatch RemoteVideoDisabled", function() {
session.subscribe.yieldsOn(driver, null, fakeSubscriberObject,
videoElement).returns(fakeSubscriberObject);
session.trigger("streamCreated", {stream: fakeSubscriberObject.stream});
fakeSubscriberObject.trigger("videoDisabled");
sinon.assert.called(dispatcher.dispatch);
sinon.assert.calledWithExactly(dispatcher.dispatch,
new sharedActions.RemoteVideoDisabled({}));
});
});
describe("signal:readyForDataChannel", function() {
beforeEach(function() {
driver.subscriber = subscriber;
@ -1270,15 +1348,19 @@ describe("loop.OTSdkDriver", function () {
});
});
describe("Events (screenshare)", function() {
describe("Events: screenshare:", function() {
var videoElement;
beforeEach(function() {
driver.connectSession(sessionData);
driver.getScreenShareElementFunc = function() {};
driver.startScreenShare({
videoSource: "window"
});
// Use a real video element so that these tests correctly reflect
// code behavior when run in a real browser.
videoElement = document.createElement("video");
});
describe("accessAllowed", function() {

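These driver tests exercise the new subscribe-completion path: session.subscribe is now given a callback, and once it yields the subscriber plus its video element the driver dispatches RemoteVideoEnabled and, when data channels are wanted, signals readyForDataChannel. A minimal sketch of a handler consistent with those assertions (not the driver's exact code):

  // Sketch only: the shape of a subscribe-completion handler that would
  // satisfy the assertions above.
  function onSubscribeCompleted(err, subscriber, videoElement) {
    if (err) {
      return;
    }
    if (subscriber.stream.hasVideo) {
      this.dispatcher.dispatch(new loop.shared.actions.RemoteVideoEnabled({
        srcVideoObject: videoElement
      }));
    }
    if (this._useDataChannels) {
      // Let the peer know we are ready to negotiate a data channel.
      this.session.signal({
        type: "readyForDataChannel",
        to: subscriber.stream.connection
      });
    }
  }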
View File

@ -814,5 +814,125 @@ describe("loop.shared.views", function() {
});
});
});
});
describe("MediaView", function() {
var view;
function mountTestComponent(props) {
return TestUtils.renderIntoDocument(
React.createElement(sharedViews.MediaView, props));
}
it("should display an avatar view", function() {
view = mountTestComponent({
displayAvatar: true,
mediaType: "local"
});
TestUtils.findRenderedComponentWithType(view,
sharedViews.AvatarView);
});
it("should display a no-video div if no source object is supplied", function() {
view = mountTestComponent({
displayAvatar: false,
mediaType: "local"
});
var element = view.getDOMNode();
expect(element.className).eql("no-video");
});
it("should display a video element if a source object is supplied", function() {
view = mountTestComponent({
displayAvatar: false,
mediaType: "local",
// This doesn't actually get assigned to the video element, but is enough
// for this test to check display of the video element.
srcVideoObject: {
fake: 1
}
});
var element = view.getDOMNode();
expect(element).not.eql(null);
expect(element.className).eql("local-video");
expect(element.muted).eql(true);
});
// We test this function by itself, as otherwise we'd need to create fake
// streams etc.
describe("#attachVideo", function() {
var fakeViewElement;
beforeEach(function() {
fakeViewElement = {
play: sinon.stub(),
tagName: "VIDEO"
};
view = mountTestComponent({
displayAvatar: false,
mediaType: "local",
srcVideoObject: {
fake: 1
}
});
});
it("should not throw if no source object is specified", function() {
expect(function() {
view.attachVideo(null);
}).to.not.Throw();
});
it("should not throw if the element is not a video object", function() {
sinon.stub(view, "getDOMNode").returns({
tagName: "DIV"
});
expect(function() {
view.attachVideo({});
}).to.not.Throw();
});
it("should attach a video object according to the standard", function() {
fakeViewElement.srcObject = null;
sinon.stub(view, "getDOMNode").returns(fakeViewElement);
view.attachVideo({
srcObject: {fake: 1}
});
expect(fakeViewElement.srcObject).eql({fake: 1});
});
it("should attach a video object for Firefox", function() {
fakeViewElement.mozSrcObject = null;
sinon.stub(view, "getDOMNode").returns(fakeViewElement);
view.attachVideo({
mozSrcObject: {fake: 2}
});
expect(fakeViewElement.mozSrcObject).eql({fake: 2});
});
it("should attach a video object for Chrome", function() {
fakeViewElement.src = null;
sinon.stub(view, "getDOMNode").returns(fakeViewElement);
view.attachVideo({
src: {fake: 2}
});
expect(fakeViewElement.src).eql({fake: 2});
});
});
});
});

View File

@ -178,24 +178,16 @@ describe("loop.standaloneRoomViews", function() {
return TestUtils.renderIntoDocument(
React.createElement(
loop.standaloneRoomViews.StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: activeRoomStore,
isFirefox: true
}));
dispatcher: dispatcher,
activeRoomStore: activeRoomStore,
isFirefox: true
}));
}
function expectActionDispatched(view) {
sinon.assert.calledOnce(dispatch);
sinon.assert.calledWithExactly(dispatch,
sinon.match.instanceOf(sharedActions.SetupStreamElements));
sinon.assert.calledWithExactly(dispatch, sinon.match(function(value) {
return value.getLocalElementFunc() ===
view.getDOMNode().querySelector(".local");
}));
sinon.assert.calledWithExactly(dispatch, sinon.match(function(value) {
return value.getRemoteElementFunc() ===
view.getDOMNode().querySelector(".remote");
}));
}
describe("#componentWillUpdate", function() {
@ -298,6 +290,10 @@ describe("loop.standaloneRoomViews", function() {
sandbox.stub(window, "matchMedia").returns({
matches: false
});
activeRoomStore.setStoreState({
remoteSrcVideoObject: {},
remoteVideoEnabled: true
});
view = mountTestComponent();
localElement = view._getElement(".local");
});
@ -317,6 +313,34 @@ describe("loop.standaloneRoomViews", function() {
expect(localElement.style.height).eql("120px");
});
it("should be a quarter of the width of the remote view element when there is no stream", function() {
activeRoomStore.setStoreState({
remoteSrcVideoObject: null,
remoteVideoEnabled: false
});
sandbox.stub(view, "getDOMNode").returns({
querySelector: function(selector) {
if (selector === ".local") {
return localElement;
}
return {
offsetWidth: 640,
offsetLeft: 0
};
}
});
view.updateLocalCameraPosition({
width: 1,
height: 0.75
});
expect(localElement.style.width).eql("160px");
expect(localElement.style.height).eql("120px");
});
it("should be a quarter of the width reduced for aspect ratio", function() {
sandbox.stub(view, "getRemoteVideoDimensions").returns({
streamWidth: 640,
@ -377,6 +401,34 @@ describe("loop.standaloneRoomViews", function() {
expect(localElement.style.left).eql("600px");
});
it("should position the stream to overlap the remote view element when there is no stream", function() {
activeRoomStore.setStoreState({
remoteSrcVideoObject: null,
remoteVideoEnabled: false
});
sandbox.stub(view, "getDOMNode").returns({
querySelector: function(selector) {
if (selector === ".local") {
return localElement;
}
return {
offsetWidth: 640,
offsetLeft: 0
};
}
});
view.updateLocalCameraPosition({
width: 1,
height: 0.75
});
expect(localElement.style.width).eql("160px");
expect(localElement.style.left).eql("600px");
});
it("should position the stream to overlap the main stream by a quarter when the aspect ratio is vertical", function() {
sandbox.stub(view, "getRemoteVideoDimensions").returns({
streamWidth: 640,
@ -576,6 +628,101 @@ describe("loop.standaloneRoomViews", function() {
});
});
describe("Participants", function() {
var videoElement;
beforeEach(function() {
videoElement = document.createElement("video");
});
it("should render local video when video_muted is false", function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
localSrcVideoObject: videoElement,
videoMuted: false
});
expect(view.getDOMNode().querySelector(".local video")).not.eql(null);
});
it("should not render a local avatar when video_muted is false", function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
videoMuted: false
});
expect(view.getDOMNode().querySelector(".local .avatar")).eql(null);
});
it("should render remote video when the room HAS_PARTICIPANTS and" +
" remoteVideoEnabled is true", function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
remoteSrcVideoObject: videoElement,
remoteVideoEnabled: true
});
expect(view.getDOMNode().querySelector(".remote video")).not.eql(null);
});
it("should not render remote video when the room HAS_PARTICIPANTS," +
" remoteVideoEnabled is false, and mediaConnected is true", function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
remoteSrcVideoObject: videoElement,
mediaConnected: true,
remoteVideoEnabled: false
});
expect(view.getDOMNode().querySelector(".remote video")).eql(null);
});
it("should render remote video when the room HAS_PARTICIPANTS," +
" and both remoteVideoEnabled and mediaConnected are false", function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
remoteSrcVideoObject: videoElement,
mediaConnected: false,
remoteVideoEnabled: false
});
expect(view.getDOMNode().querySelector(".remote video")).not.eql(null);
});
it("should not render a remote avatar when the room is in MEDIA_WAIT", function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.MEDIA_WAIT,
remoteSrcVideoObject: videoElement,
remoteVideoEnabled: false
});
expect(view.getDOMNode().querySelector(".remote .avatar")).eql(null);
});
it("should not render a remote avatar when the room is CLOSING and" +
" remoteVideoEnabled is false", function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.CLOSING,
remoteSrcVideoObject: videoElement,
remoteVideoEnabled: false
});
expect(view.getDOMNode().querySelector(".remote .avatar")).eql(null);
});
it("should render a remote avatar when the room HAS_PARTICIPANTS, " +
"remoteVideoEnabled is false, and mediaConnected is true", function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
remoteSrcVideoObject: videoElement,
remoteVideoEnabled: false,
mediaConnected: true
});
expect(view.getDOMNode().querySelector(".remote .avatar")).not.eql(null);
});
});
describe("Leave button", function() {
function getLeaveButton(view) {
return view.getDOMNode().querySelector(".btn-hangup");
@ -676,6 +823,18 @@ describe("loop.standaloneRoomViews", function() {
expect(view.getDOMNode().querySelector(".local-stream-audio"))
.not.eql(null);
});
it("should render a local avatar if the room HAS_PARTICIPANTS and" +
" .videoMuted is true",
function() {
activeRoomStore.setStoreState({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
videoMuted: true
});
expect(view.getDOMNode().querySelector(".local .avatar")).
not.eql(null);
});
});
describe("Marketplace hidden iframe", function() {

View File

@ -84,6 +84,7 @@
</script>
<script src="../content/js/panel.js"></script>
<script src="../content/js/conversation.js"></script>
<script src="react-frame-component.js"></script>
<script src="ui-showcase.js"></script>
</body>
</html>

View File

@ -0,0 +1,130 @@
/*
* Copied from <https://github.com/ryanseddon/react-frame-component> 0.3.2,
* by Ryan Seddon, under the MIT license, since that original version requires
* a browserify-style loader.
*/
/**
* This is an array of frames that are queued and waiting to be loaded before
* their rendering is completed.
*
* @type {Array}
*/
window.queuedFrames = [];
/**
* Renders this.props.children inside an <iframe>.
*
* Works by creating the iframe, waiting for that to finish, and then
* rendering the children inside that. Waits for a while in the hopes that the
* contents will have been rendered, and then fires a callback indicating that.
*
* @see onContentsRendered for the gory details about this.
*
* @type {ReactComponentFactory<P>}
*/
window.Frame = React.createClass({
propTypes: {
style: React.PropTypes.object,
head: React.PropTypes.node,
width: React.PropTypes.number,
height: React.PropTypes.number,
onContentsRendered: React.PropTypes.func
},
render: function() {
return React.createElement("iframe", {
style: this.props.style,
head: this.props.head,
width: this.props.width,
height: this.props.height
});
},
componentDidMount: function() {
this.renderFrameContents();
},
renderFrameContents: function() {
var doc = this.getDOMNode().contentDocument;
if (doc && doc.readyState === "complete") {
// Remove this from the queue.
window.queuedFrames.splice(window.queuedFrames.indexOf(this), 1);
var iframeHead = doc.querySelector("head");
var parentHeadChildren = document.querySelector("head").children;
[].forEach.call(parentHeadChildren, function(parentHeadNode) {
iframeHead.appendChild(parentHeadNode.cloneNode(true));
});
var contents = React.createElement("div",
undefined,
this.props.head,
this.props.children
);
React.render(contents, doc.body, this.fireOnContentsRendered.bind(this));
} else {
// Queue it, only if it isn't already. We do need to set the timeout
// regardless, as this function can get re-entered several times.
if (window.queuedFrames.indexOf(this) === -1) {
window.queuedFrames.push(this);
}
setTimeout(this.renderFrameContents.bind(this), 0);
}
},
/**
* Fires the onContentsRendered callback passed in via this.props,
* with the first argument set to the window global used by the iframe.
* This is useful in extracting things specific to that iframe (such as
* the matchMedia function) for use by code running in that iframe. Once
* React gets a more complete "context" feature:
*
* https://facebook.github.io/react/blog/2015/02/24/streamlining-react-elements.html#solution-make-context-parent-based-instead-of-owner-based
*
* we should be able to avoid reaching into the DOM like this.
*
* XXX wait a little while. After React has rendered this iframe (eg the
* virtual DOM cache gets flushed to the browser), there's still more stuff
* that needs to happen before layout completes. If onContentsRendered fires
* before that happens, the wrong sizes (eg remote stream vertical height
* of 0) are used to compute the position in the MediaSetupStream, resulting
* in everything looking wonky. One high-likelihood candidate for the delay
* here involves loading/decoding poster images, but even using link
* rel=prefetch on those isn't enough to work around this problem, so there
* may be more.
*
* There doesn't appear to be a good cross-browser way to handle this
* at the moment without gross violation of encapsulation (see
* http://stackoverflow.com/questions/27241186/how-to-determine-when-document-has-loaded-after-loading-external-css
* for discussion of a related problem).
*
* For now, just wait for multiple seconds. Yuck.
*/
fireOnContentsRendered: function() {
if (!this.props.onContentsRendered) {
return;
}
var contentWindow;
try {
contentWindow = this.getDOMNode().contentWindow;
if (!contentWindow) {
throw new Error("no content window returned");
}
} catch (ex) {
console.error("exception getting content window", ex);
}
// Using bind to construct a "partial function", where |this| is unchanged,
// but the first parameter is guaranteed to be set. Details at
// https://developer.mozilla.org/de/docs/Web/JavaScript/Reference/Global_Objects/Function/bind#Example.3A_Partial_Functions
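// Illustrative aside (not in the original source): given
//   function log(prefix, msg) { console.log(prefix, msg); }
// the partial function log.bind(undefined, "warn:") behaves like
//   function(msg) { return log("warn:", msg); }
// which is the shape we want for the deferred call below.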
setTimeout(this.props.onContentsRendered.bind(undefined, contentWindow),
3000);
},
componentDidUpdate: function() {
this.renderFrameContents();
},
componentWillUnmount: function() {
React.unmountComponentAtNode(React.findDOMNode(this).contentDocument.body);
}
});
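For orientation, a minimal usage sketch of the Frame component above (not part of the patch; the element, size, and media query used here are hypothetical):

// Hypothetical usage of window.Frame with the React 0.12-era API this
// showcase already uses (React.createElement / React.render).
var framedExample = React.createElement(window.Frame, {
  width: 480,
  height: 360,
  onContentsRendered: function(contentWindow) {
    // The iframe's own window is handed back so callers can use
    // per-frame globals such as matchMedia.
    console.log("narrow layout?",
      contentWindow.matchMedia("(max-width: 640px)").matches);
  }
}, React.createElement("div", null, "Rendered inside the iframe"));

React.render(framedExample, document.getElementById("main"));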

Binary file not shown (image replaced; 100 KiB before, 402 KiB after).

Binary file not shown (image replaced; 2.0 MiB before, 536 KiB after).

Binary file not shown (new image; 263 KiB).

View File

@ -3,7 +3,9 @@
* file, You can obtain one at http://mozilla.org/MPL/2.0/. */
body {
/* Override the hidden in common.css */
/* Override the hidden in common.css. Very important, as otherwise you can't
* scroll the showcase.
*/
overflow: visible;
}
@ -107,72 +109,12 @@ body {
font-weight: bold;
}
/*
* Switched to using height: 100% in standalone version
* this mocks it for the ui so that the component has height
* */
.standalone .video-layout-wrapper,
.standalone .remote_wrapper {
min-height: 550px;
}
@media screen and (max-width:640px) {
.standalone .local-stream {
background-size: cover;
}
.standalone .local-stream,
.conversation .media.nested .remote {
background-size: cover;
background-position: center;
}
.standalone .remote_wrapper {
width: 100%;
background-size: cover;
background-position: center;
}
}
.remote_wrapper {
background-image: url("sample-img/video-screen-remote.png");
background-repeat: no-repeat;
background-size: cover;
}
.local-stream {
background-image: url("sample-img/video-screen-local.png");
background-repeat: no-repeat;
}
.local-stream.local:not(.local-stream-audio) {
background-size: cover;
}
.call-action-group .btn-group-chevron,
.call-action-group .btn-group {
/* Prevent box overflow due to long string */
max-width: 120px;
}
.conversation .media.nested .remote {
/* Height of absolute box covers media control buttons. UI showcase only.
* When tokbox inserts the markup into the page the problem goes away */
bottom: auto;
}
.standalone .ended-conversation .remote_wrapper,
.standalone .video-layout-wrapper {
/* Removes the fake video image for ended conversations */
background: none;
}
/* Rooms edge cases */
.standalone .room-conversation .remote_wrapper {
background: none;
}
/* SVG icons showcase */
.svg-icons h3 {

View File

@ -2,7 +2,7 @@
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/. */
/* global uncaughtError:true */
/* global Frame:false uncaughtError:true */
(function() {
"use strict";
@ -18,6 +18,7 @@
// 1.2. Conversation Window
var AcceptCallView = loop.conversationViews.AcceptCallView;
var DesktopPendingConversationView = loop.conversationViews.PendingConversationView;
var OngoingConversationView = loop.conversationViews.OngoingConversationView;
var CallFailedView = loop.conversationViews.CallFailedView;
var DesktopRoomConversationView = loop.roomViews.DesktopRoomConversationView;
@ -25,17 +26,10 @@
var HomeView = loop.webapp.HomeView;
var UnsupportedBrowserView = loop.webapp.UnsupportedBrowserView;
var UnsupportedDeviceView = loop.webapp.UnsupportedDeviceView;
var CallUrlExpiredView = loop.webapp.CallUrlExpiredView;
var GumPromptConversationView = loop.webapp.GumPromptConversationView;
var WaitingConversationView = loop.webapp.WaitingConversationView;
var StartConversationView = loop.webapp.StartConversationView;
var FailedConversationView = loop.webapp.FailedConversationView;
var EndedConversationView = loop.webapp.EndedConversationView;
var StandaloneRoomView = loop.standaloneRoomViews.StandaloneRoomView;
// 3. Shared components
var ConversationToolbar = loop.shared.views.ConversationToolbar;
var ConversationView = loop.shared.views.ConversationView;
var FeedbackView = loop.shared.views.FeedbackView;
// Store constants
@ -94,13 +88,154 @@
}
}, Backbone.Events);
var activeRoomStore = new loop.store.ActiveRoomStore(dispatcher, {
mozLoop: navigator.mozLoop,
sdkDriver: mockSDK
/**
* Every view that uses an activeRoomStore needs its own; if they shared
* an active store, they'd interfere with each other.
*
* @param options
* @returns {loop.store.ActiveRoomStore}
*/
function makeActiveRoomStore(options) {
var dispatcher = new loop.Dispatcher();
var store = new loop.store.ActiveRoomStore(dispatcher, {
mozLoop: navigator.mozLoop,
sdkDriver: mockSDK
});
if (!("remoteVideoEnabled" in options)) {
options.remoteVideoEnabled = true;
}
if (!("mediaConnected" in options)) {
options.mediaConnected = true;
}
store.setStoreState({
mediaConnected: options.mediaConnected,
remoteVideoEnabled: options.remoteVideoEnabled,
roomName: "A Very Long Conversation Name",
roomState: options.roomState,
used: !!options.roomUsed,
videoMuted: !!options.videoMuted
});
store.forcedUpdate = function forcedUpdate(contentWindow) {
// Since this is called by setTimeout, we don't want to lose any
// exceptions if there's a problem and we need to debug, so...
try {
// the dimensions here are taken from the poster images that we're
// using, since they give the <video> elements their initial intrinsic
// size. This ensures that the right aspect ratios are calculated.
// These are forced to 640x480, because it makes it visually easy to
// validate that the showcase looks like the real app on a machine
// (eg MacBook Pro) where that is the default camera resolution.
var newStoreState = {
localVideoDimensions: {
camera: {height: 480, orientation: 0, width: 640}
},
mediaConnected: options.mediaConnected,
receivingScreenShare: !!options.receivingScreenShare,
remoteVideoDimensions: {
camera: {height: 480, orientation: 0, width: 640}
},
remoteVideoEnabled: options.remoteVideoEnabled,
matchMedia: contentWindow.matchMedia.bind(contentWindow),
roomState: options.roomState,
videoMuted: !!options.videoMuted
};
if (options.receivingScreenShare) {
// Note that the image we're using had to be scaled a bit, and
// it still ended up a bit narrower than the live thing that
// WebRTC sends; presumably a different scaling algorithm.
// For showcase purposes, this shouldn't matter much, as the sizes
// of things being shared will be fairly arbitrary.
newStoreState.remoteVideoDimensions.screen =
{height: 456, orientation: 0, width: 641};
}
store.setStoreState(newStoreState);
} catch (ex) {
console.error("exception in forcedUpdate:", ex);
}
};
return store;
}
var activeRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS
});
var joinedRoomStore = makeActiveRoomStore({
mediaConnected: false,
roomState: ROOM_STATES.JOINED,
remoteVideoEnabled: false
});
var readyRoomStore = makeActiveRoomStore({
mediaConnected: false,
roomState: ROOM_STATES.READY
});
var updatingActiveRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS
});
var localFaceMuteRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
videoMuted: true
});
var remoteFaceMuteRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
remoteVideoEnabled: false,
mediaConnected: true
});
var updatingSharingRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
receivingScreenShare: true
});
var fullActiveRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.FULL
});
var failedRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.FAILED
});
var endedRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.ENDED,
roomUsed: true
});
var roomStore = new loop.store.RoomStore(dispatcher, {
mozLoop: navigator.mozLoop
});
var desktopLocalFaceMuteActiveRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
videoMuted: true
});
var desktopLocalFaceMuteRoomStore = new loop.store.RoomStore(dispatcher, {
mozLoop: navigator.mozLoop,
activeRoomStore: desktopLocalFaceMuteActiveRoomStore
});
var desktopRemoteFaceMuteActiveRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
remoteVideoEnabled: false,
mediaConnected: true
});
var desktopRemoteFaceMuteRoomStore = new loop.store.RoomStore(dispatcher, {
mozLoop: navigator.mozLoop,
activeRoomStore: desktopRemoteFaceMuteActiveRoomStore
});
var feedbackStore = new loop.store.FeedbackStore(dispatcher, {
feedbackClient: stageFeedbackApiClient
});
@ -216,6 +351,37 @@
}
});
var FramedExample = React.createClass({displayName: "FramedExample",
propTypes: {
width: React.PropTypes.number,
height: React.PropTypes.number,
onContentsRendered: React.PropTypes.func
},
makeId: function(prefix) {
return (prefix || "") + this.props.summary.toLowerCase().replace(/\s/g, "-");
},
render: function() {
var cx = React.addons.classSet;
return (
React.createElement("div", {className: "example"},
React.createElement("h3", {id: this.makeId()},
this.props.summary,
React.createElement("a", {href: this.makeId("#")}, " ¶")
),
React.createElement("div", {className: cx({comp: true, dashed: this.props.dashed}),
style: this.props.style},
React.createElement(Frame, {width: this.props.width, height: this.props.height,
onContentsRendered: this.props.onContentsRendered},
this.props.children
)
)
)
);
}
});
var Example = React.createClass({displayName: "Example",
makeId: function(prefix) {
return (prefix || "") + this.props.summary.toLowerCase().replace(/\s/g, "-");
@ -272,6 +438,7 @@
});
var App = React.createClass({displayName: "App",
render: function() {
return (
React.createElement(ShowCase, null,
@ -364,19 +531,19 @@
React.createElement(Section, {name: "ConversationToolbar"},
React.createElement("h2", null, "Desktop Conversation Window"),
React.createElement("div", {className: "fx-embedded override-position"},
React.createElement(Example, {summary: "Default", dashed: "true", style: {width: "300px", height: "272px"}},
React.createElement(Example, {summary: "Default", style: {width: "300px", height: "26px"}},
React.createElement(ConversationToolbar, {video: {enabled: true},
audio: {enabled: true},
hangup: noop,
publishStream: noop})
),
React.createElement(Example, {summary: "Video muted", style: {width: "300px", height: "272px"}},
React.createElement(Example, {summary: "Video muted", style: {width: "300px", height: "26px"}},
React.createElement(ConversationToolbar, {video: {enabled: false},
audio: {enabled: true},
hangup: noop,
publishStream: noop})
),
React.createElement(Example, {summary: "Audio muted", style: {width: "300px", height: "272px"}},
React.createElement(Example, {summary: "Audio muted", style: {width: "300px", height: "26px"}},
React.createElement(ConversationToolbar, {video: {enabled: true},
audio: {enabled: false},
hangup: noop,
@ -407,30 +574,6 @@
)
),
React.createElement(Section, {name: "GumPromptConversationView"},
React.createElement(Example, {summary: "Gum Prompt conversation view", dashed: "true"},
React.createElement("div", {className: "standalone"},
React.createElement(GumPromptConversationView, null)
)
)
),
React.createElement(Section, {name: "WaitingConversationView"},
React.createElement(Example, {summary: "Waiting conversation view (connecting)", dashed: "true"},
React.createElement("div", {className: "standalone"},
React.createElement(WaitingConversationView, {websocket: mockWebSocket,
dispatcher: dispatcher})
)
),
React.createElement(Example, {summary: "Waiting conversation view (ringing)", dashed: "true"},
React.createElement("div", {className: "standalone"},
React.createElement(WaitingConversationView, {websocket: mockWebSocket,
dispatcher: dispatcher,
callState: "ringing"})
)
)
),
React.createElement(Section, {name: "PendingConversationView (Desktop)"},
React.createElement(Example, {summary: "Connecting", dashed: "true",
style: {width: "300px", height: "272px"}},
@ -469,94 +612,61 @@
)
),
React.createElement(Section, {name: "StartConversationView"},
React.createElement(Example, {summary: "Start conversation view", dashed: "true"},
React.createElement("div", {className: "standalone"},
React.createElement(StartConversationView, {conversation: mockConversationModel,
client: mockClient,
notifications: notifications})
)
)
),
React.createElement(Section, {name: "FailedConversationView"},
React.createElement(Example, {summary: "Failed conversation view", dashed: "true"},
React.createElement("div", {className: "standalone"},
React.createElement(FailedConversationView, {conversation: mockConversationModel,
client: mockClient,
notifications: notifications})
)
)
),
React.createElement(Section, {name: "ConversationView"},
React.createElement(Example, {summary: "Desktop conversation window", dashed: "true",
style: {width: "300px", height: "272px"}},
React.createElement(Section, {name: "OngoingConversationView"},
React.createElement(FramedExample, {width: 298, height: 254,
summary: "Desktop ongoing conversation window"},
React.createElement("div", {className: "fx-embedded"},
React.createElement(ConversationView, {sdk: mockSDK,
model: mockConversationModel,
video: {enabled: true},
audio: {enabled: true}})
React.createElement(OngoingConversationView, {
dispatcher: dispatcher,
video: {enabled: true},
audio: {enabled: true},
localPosterUrl: "sample-img/video-screen-local.png",
remotePosterUrl: "sample-img/video-screen-remote.png",
remoteVideoEnabled: true,
mediaConnected: true})
)
),
React.createElement(Example, {summary: "Desktop conversation window large", dashed: "true"},
React.createElement("div", {className: "breakpoint", "data-breakpoint-width": "800px",
"data-breakpoint-height": "600px"},
React.createElement(FramedExample, {width: 800, height: 600,
summary: "Desktop ongoing conversation window large"},
React.createElement("div", {className: "fx-embedded"},
React.createElement(ConversationView, {sdk: mockSDK,
React.createElement(OngoingConversationView, {
dispatcher: dispatcher,
video: {enabled: true},
audio: {enabled: true},
model: mockConversationModel})
localPosterUrl: "sample-img/video-screen-local.png",
remotePosterUrl: "sample-img/video-screen-remote.png",
remoteVideoEnabled: true,
mediaConnected: true})
)
)
),
React.createElement(Example, {summary: "Desktop conversation window local audio stream",
dashed: "true", style: {width: "300px", height: "272px"}},
React.createElement(FramedExample, {width: 298, height: 254,
summary: "Desktop ongoing conversation window - local face mute"},
React.createElement("div", {className: "fx-embedded"},
React.createElement(ConversationView, {sdk: mockSDK,
video: {enabled: false},
audio: {enabled: true},
model: mockConversationModel})
React.createElement(OngoingConversationView, {
dispatcher: dispatcher,
video: {enabled: false},
audio: {enabled: true},
remoteVideoEnabled: true,
remotePosterUrl: "sample-img/video-screen-remote.png",
mediaConnected: true})
)
),
React.createElement(Example, {summary: "Standalone version"},
React.createElement("div", {className: "standalone"},
React.createElement(ConversationView, {sdk: mockSDK,
video: {enabled: true},
audio: {enabled: true},
model: mockConversationModel})
React.createElement(FramedExample, {width: 298, height: 254,
summary: "Desktop ongoing conversation window - remote face mute"},
React.createElement("div", {className: "fx-embedded"},
React.createElement(OngoingConversationView, {
dispatcher: dispatcher,
video: {enabled: true},
audio: {enabled: true},
remoteVideoEnabled: false,
localPosterUrl: "sample-img/video-screen-local.png",
mediaConnected: true})
)
)
),
React.createElement(Section, {name: "ConversationView-640"},
React.createElement(Example, {summary: "640px breakpoint for conversation view"},
React.createElement("div", {className: "breakpoint",
style: {"text-align": "center"},
"data-breakpoint-width": "400px",
"data-breakpoint-height": "780px"},
React.createElement("div", {className: "standalone"},
React.createElement(ConversationView, {sdk: mockSDK,
video: {enabled: true},
audio: {enabled: true},
model: mockConversationModel})
)
)
)
),
React.createElement(Section, {name: "ConversationView-LocalAudio"},
React.createElement(Example, {summary: "Local stream is audio only"},
React.createElement("div", {className: "standalone"},
React.createElement(ConversationView, {sdk: mockSDK,
video: {enabled: false},
audio: {enabled: true},
model: mockConversationModel})
)
)
),
React.createElement(Section, {name: "FeedbackView"},
@ -575,28 +685,6 @@
)
),
React.createElement(Section, {name: "CallUrlExpiredView"},
React.createElement(Example, {summary: "Firefox User"},
React.createElement(CallUrlExpiredView, {isFirefox: true})
),
React.createElement(Example, {summary: "Non-Firefox User"},
React.createElement(CallUrlExpiredView, {isFirefox: false})
)
),
React.createElement(Section, {name: "EndedConversationView"},
React.createElement(Example, {summary: "Displays the feedback form"},
React.createElement("div", {className: "standalone"},
React.createElement(EndedConversationView, {sdk: mockSDK,
video: {enabled: true},
audio: {enabled: true},
conversation: mockConversationModel,
feedbackStore: feedbackStore,
onAfterFeedbackReceived: noop})
)
)
),
React.createElement(Section, {name: "AlertMessages"},
React.createElement(Example, {summary: "Various alerts"},
React.createElement("div", {className: "alert alert-warning"},
@ -615,15 +703,6 @@
)
),
React.createElement(Section, {name: "HomeView"},
React.createElement(Example, {summary: "Standalone Home View"},
React.createElement("div", {className: "standalone"},
React.createElement(HomeView, null)
)
)
),
React.createElement(Section, {name: "UnsupportedBrowserView"},
React.createElement(Example, {summary: "Standalone Unsupported Browser"},
React.createElement("div", {className: "standalone"},
@ -641,97 +720,171 @@
),
React.createElement(Section, {name: "DesktopRoomConversationView"},
React.createElement(Example, {summary: "Desktop room conversation (invitation)", dashed: "true",
style: {width: "260px", height: "265px"}},
React.createElement(FramedExample, {width: 298, height: 254,
summary: "Desktop room conversation (invitation)"},
React.createElement("div", {className: "fx-embedded"},
React.createElement(DesktopRoomConversationView, {
roomStore: roomStore,
dispatcher: dispatcher,
mozLoop: navigator.mozLoop,
localPosterUrl: "sample-img/video-screen-local.png",
roomState: ROOM_STATES.INIT})
)
),
React.createElement(Example, {summary: "Desktop room conversation", dashed: "true",
style: {width: "260px", height: "265px"}},
React.createElement(FramedExample, {width: 298, height: 254,
summary: "Desktop room conversation"},
React.createElement("div", {className: "fx-embedded"},
React.createElement(DesktopRoomConversationView, {
roomStore: roomStore,
dispatcher: dispatcher,
mozLoop: navigator.mozLoop,
localPosterUrl: "sample-img/video-screen-local.png",
remotePosterUrl: "sample-img/video-screen-remote.png",
roomState: ROOM_STATES.HAS_PARTICIPANTS})
)
),
React.createElement(FramedExample, {width: 298, height: 254,
summary: "Desktop room conversation local face-mute"},
React.createElement("div", {className: "fx-embedded"},
React.createElement(DesktopRoomConversationView, {
roomStore: desktopLocalFaceMuteRoomStore,
dispatcher: dispatcher,
mozLoop: navigator.mozLoop,
remotePosterUrl: "sample-img/video-screen-remote.png"})
)
),
React.createElement(FramedExample, {width: 298, height: 254,
summary: "Desktop room conversation remote face-mute"},
React.createElement("div", {className: "fx-embedded"},
React.createElement(DesktopRoomConversationView, {
roomStore: desktopRemoteFaceMuteRoomStore,
dispatcher: dispatcher,
mozLoop: navigator.mozLoop,
localPosterUrl: "sample-img/video-screen-local.png"})
)
)
),
React.createElement(Section, {name: "StandaloneRoomView"},
React.createElement(Example, {summary: "Standalone room conversation (ready)"},
React.createElement(FramedExample, {width: 644, height: 483,
summary: "Standalone room conversation (ready)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: activeRoomStore,
activeRoomStore: readyRoomStore,
roomState: ROOM_STATES.READY,
isFirefox: true})
)
),
React.createElement(Example, {summary: "Standalone room conversation (joined)"},
React.createElement(FramedExample, {width: 644, height: 483,
summary: "Standalone room conversation (joined)",
onContentsRendered: joinedRoomStore.forcedUpdate},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: activeRoomStore,
roomState: ROOM_STATES.JOINED,
activeRoomStore: joinedRoomStore,
localPosterUrl: "sample-img/video-screen-local.png",
isFirefox: true})
)
),
React.createElement(Example, {summary: "Standalone room conversation (has-participants)"},
React.createElement(FramedExample, {width: 644, height: 483,
onContentsRendered: updatingActiveRoomStore.forcedUpdate,
summary: "Standalone room conversation (has-participants, 644x483)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: updatingActiveRoomStore,
roomState: ROOM_STATES.HAS_PARTICIPANTS,
isFirefox: true,
localPosterUrl: "sample-img/video-screen-local.png",
remotePosterUrl: "sample-img/video-screen-remote.png"})
)
),
React.createElement(FramedExample, {width: 644, height: 483,
onContentsRendered: localFaceMuteRoomStore.forcedUpdate,
summary: "Standalone room conversation (local face mute, has-participants, 644x483)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: activeRoomStore,
roomState: ROOM_STATES.HAS_PARTICIPANTS,
activeRoomStore: localFaceMuteRoomStore,
isFirefox: true,
localPosterUrl: "sample-img/video-screen-local.png",
remotePosterUrl: "sample-img/video-screen-remote.png"})
)
),
React.createElement(FramedExample, {width: 644, height: 483,
onContentsRendered: remoteFaceMuteRoomStore.forcedUpdate,
summary: "Standalone room conversation (remote face mute, has-participants, 644x483)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: remoteFaceMuteRoomStore,
isFirefox: true,
localPosterUrl: "sample-img/video-screen-local.png",
remotePosterUrl: "sample-img/video-screen-remote.png"})
)
),
React.createElement(FramedExample, {width: 800, height: 660,
onContentsRendered: updatingSharingRoomStore.forcedUpdate,
summary: "Standalone room convo (has-participants, receivingScreenShare, 800x660)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: updatingSharingRoomStore,
roomState: ROOM_STATES.HAS_PARTICIPANTS,
isFirefox: true,
localPosterUrl: "sample-img/video-screen-local.png",
remotePosterUrl: "sample-img/video-screen-remote.png",
screenSharePosterUrl: "sample-img/video-screen-terminal.png"}
)
)
),
React.createElement(FramedExample, {width: 644, height: 483,
summary: "Standalone room conversation (full - FFx user)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: fullActiveRoomStore,
isFirefox: true})
)
),
React.createElement(Example, {summary: "Standalone room conversation (full - FFx user)"},
React.createElement(FramedExample, {width: 644, height: 483,
summary: "Standalone room conversation (full - non FFx user)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: activeRoomStore,
roomState: ROOM_STATES.FULL,
isFirefox: true})
)
),
React.createElement(Example, {summary: "Standalone room conversation (full - non FFx user)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: activeRoomStore,
roomState: ROOM_STATES.FULL,
activeRoomStore: fullActiveRoomStore,
isFirefox: false})
)
),
React.createElement(Example, {summary: "Standalone room conversation (feedback)"},
React.createElement(FramedExample, {width: 644, height: 483,
summary: "Standalone room conversation (feedback)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: activeRoomStore,
activeRoomStore: endedRoomStore,
feedbackStore: feedbackStore,
roomState: ROOM_STATES.ENDED,
isFirefox: false})
)
),
React.createElement(Example, {summary: "Standalone room conversation (failed)"},
React.createElement(FramedExample, {width: 644, height: 483,
summary: "Standalone room conversation (failed)"},
React.createElement("div", {className: "standalone"},
React.createElement(StandaloneRoomView, {
dispatcher: dispatcher,
activeRoomStore: activeRoomStore,
roomState: ROOM_STATES.FAILED,
activeRoomStore: failedRoomStore,
isFirefox: false})
)
)
@ -754,45 +907,6 @@
}
});
/**
* Render components that have different styles across
* CSS media rules in their own iframe to mimic the viewport
* */
function _renderComponentsInIframes() {
var parents = document.querySelectorAll(".breakpoint");
[].forEach.call(parents, appendChildInIframe);
/**
* Extracts the component from the DOM and appends it in an iframe
*
* @type {HTMLElement} parent - Parent DOM node of a component & iframe
* */
function appendChildInIframe(parent) {
var styles = document.querySelector("head").children;
var component = parent.children[0];
var iframe = document.createElement("iframe");
var width = parent.dataset.breakpointWidth;
var height = parent.dataset.breakpointHeight;
iframe.style.width = width;
iframe.style.height = height;
parent.appendChild(iframe);
iframe.src = "about:blank";
// Workaround for bug 297685
iframe.onload = function () {
var iframeHead = iframe.contentDocument.querySelector("head");
iframe.contentDocument.documentElement.querySelector("body")
.appendChild(component);
[].forEach.call(styles, function(style) {
iframeHead.appendChild(style.cloneNode(true));
});
};
}
}
window.addEventListener("DOMContentLoaded", function() {
try {
React.renderComponent(React.createElement(App, null), document.getElementById("main"));
@ -805,23 +919,28 @@
uncaughtError = err;
}
_renderComponentsInIframes();
// Wait until all the FramedExamples have been fully loaded.
setTimeout(function waitForQueuedFrames() {
if (window.queuedFrames.length != 0) {
setTimeout(waitForQueuedFrames, 500);
return;
}
// Put the title back, in case views changed it.
document.title = "Loop UI Components Showcase";
// Put the title back, in case views changed it.
document.title = "Loop UI Components Showcase";
// This simulates the mocha layout for errors which means we can run
// this alongside our other unit tests but use the same harness.
if (uncaughtError) {
$("#results").append("<div class='failures'><em>1</em></div>");
$("#results").append("<li class='test fail'>" +
"<h2>Errors rendering UI-Showcase</h2>" +
"<pre class='error'>" + uncaughtError + "\n" + uncaughtError.stack + "</pre>" +
"</li>");
} else {
$("#results").append("<div class='failures'><em>0</em></div>");
}
$("#results").append("<p id='complete'>Complete.</p>");
// This simulates the mocha layout for errors which means we can run
// this alongside our other unit tests but use the same harness.
if (uncaughtError) {
$("#results").append("<div class='failures'><em>1</em></div>");
$("#results").append("<li class='test fail'>" +
"<h2>Errors rendering UI-Showcase</h2>" +
"<pre class='error'>" + uncaughtError + "\n" + uncaughtError.stack + "</pre>" +
"</li>");
} else {
$("#results").append("<div class='failures'><em>0</em></div>");
}
$("#results").append("<p id='complete'>Complete.</p>");
}, 1000);
});
})();

View File

@ -2,7 +2,7 @@
* License, v. 2.0. If a copy of the MPL was not distributed with this
* file, You can obtain one at http://mozilla.org/MPL/2.0/. */
/* global uncaughtError:true */
/* global Frame:false uncaughtError:true */
(function() {
"use strict";
@ -18,6 +18,7 @@
// 1.2. Conversation Window
var AcceptCallView = loop.conversationViews.AcceptCallView;
var DesktopPendingConversationView = loop.conversationViews.PendingConversationView;
var OngoingConversationView = loop.conversationViews.OngoingConversationView;
var CallFailedView = loop.conversationViews.CallFailedView;
var DesktopRoomConversationView = loop.roomViews.DesktopRoomConversationView;
@ -25,17 +26,10 @@
var HomeView = loop.webapp.HomeView;
var UnsupportedBrowserView = loop.webapp.UnsupportedBrowserView;
var UnsupportedDeviceView = loop.webapp.UnsupportedDeviceView;
var CallUrlExpiredView = loop.webapp.CallUrlExpiredView;
var GumPromptConversationView = loop.webapp.GumPromptConversationView;
var WaitingConversationView = loop.webapp.WaitingConversationView;
var StartConversationView = loop.webapp.StartConversationView;
var FailedConversationView = loop.webapp.FailedConversationView;
var EndedConversationView = loop.webapp.EndedConversationView;
var StandaloneRoomView = loop.standaloneRoomViews.StandaloneRoomView;
// 3. Shared components
var ConversationToolbar = loop.shared.views.ConversationToolbar;
var ConversationView = loop.shared.views.ConversationView;
var FeedbackView = loop.shared.views.FeedbackView;
// Store constants
@ -94,13 +88,154 @@
}
}, Backbone.Events);
var activeRoomStore = new loop.store.ActiveRoomStore(dispatcher, {
mozLoop: navigator.mozLoop,
sdkDriver: mockSDK
/**
* Every view that uses an activeRoomStore needs its own; if they shared
* an active store, they'd interfere with each other.
*
* @param options
* @returns {loop.store.ActiveRoomStore}
*/
function makeActiveRoomStore(options) {
var dispatcher = new loop.Dispatcher();
var store = new loop.store.ActiveRoomStore(dispatcher, {
mozLoop: navigator.mozLoop,
sdkDriver: mockSDK
});
if (!("remoteVideoEnabled" in options)) {
options.remoteVideoEnabled = true;
}
if (!("mediaConnected" in options)) {
options.mediaConnected = true;
}
store.setStoreState({
mediaConnected: options.mediaConnected,
remoteVideoEnabled: options.remoteVideoEnabled,
roomName: "A Very Long Conversation Name",
roomState: options.roomState,
used: !!options.roomUsed,
videoMuted: !!options.videoMuted
});
store.forcedUpdate = function forcedUpdate(contentWindow) {
// Since this is called by setTimeout, we don't want to lose any
// exceptions if there's a problem and we need to debug, so...
try {
// the dimensions here are taken from the poster images that we're
// using, since they give the <video> elements their initial intrinsic
// size. This ensures that the right aspect ratios are calculated.
// These are forced to 640x480, because it makes it visually easy to
// validate that the showcase looks like the real app on a machine
// (eg MacBook Pro) where that is the default camera resolution.
var newStoreState = {
localVideoDimensions: {
camera: {height: 480, orientation: 0, width: 640}
},
mediaConnected: options.mediaConnected,
receivingScreenShare: !!options.receivingScreenShare,
remoteVideoDimensions: {
camera: {height: 480, orientation: 0, width: 640}
},
remoteVideoEnabled: options.remoteVideoEnabled,
matchMedia: contentWindow.matchMedia.bind(contentWindow),
roomState: options.roomState,
videoMuted: !!options.videoMuted
};
if (options.receivingScreenShare) {
// Note that the image we're using had to be scaled a bit, and
// it still ended up a bit narrower than the live thing that
// WebRTC sends; presumably a different scaling algorithm.
// For showcase purposes, this shouldn't matter much, as the sizes
// of things being shared will be fairly arbitrary.
newStoreState.remoteVideoDimensions.screen =
{height: 456, orientation: 0, width: 641};
}
store.setStoreState(newStoreState);
} catch (ex) {
console.error("exception in forcedUpdate:", ex);
}
};
return store;
}
var activeRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS
});
var joinedRoomStore = makeActiveRoomStore({
mediaConnected: false,
roomState: ROOM_STATES.JOINED,
remoteVideoEnabled: false
});
var readyRoomStore = makeActiveRoomStore({
mediaConnected: false,
roomState: ROOM_STATES.READY
});
var updatingActiveRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS
});
var localFaceMuteRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
videoMuted: true
});
var remoteFaceMuteRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
remoteVideoEnabled: false,
mediaConnected: true
});
var updatingSharingRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
receivingScreenShare: true
});
var fullActiveRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.FULL
});
var failedRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.FAILED
});
var endedRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.ENDED,
roomUsed: true
});
var roomStore = new loop.store.RoomStore(dispatcher, {
mozLoop: navigator.mozLoop
});
var desktopLocalFaceMuteActiveRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
videoMuted: true
});
var desktopLocalFaceMuteRoomStore = new loop.store.RoomStore(dispatcher, {
mozLoop: navigator.mozLoop,
activeRoomStore: desktopLocalFaceMuteActiveRoomStore
});
var desktopRemoteFaceMuteActiveRoomStore = makeActiveRoomStore({
roomState: ROOM_STATES.HAS_PARTICIPANTS,
remoteVideoEnabled: false,
mediaConnected: true
});
var desktopRemoteFaceMuteRoomStore = new loop.store.RoomStore(dispatcher, {
mozLoop: navigator.mozLoop,
activeRoomStore: desktopRemoteFaceMuteActiveRoomStore
});
var feedbackStore = new loop.store.FeedbackStore(dispatcher, {
feedbackClient: stageFeedbackApiClient
});
@ -216,6 +351,37 @@
}
});
var FramedExample = React.createClass({
propTypes: {
width: React.PropTypes.number,
height: React.PropTypes.number,
onContentsRendered: React.PropTypes.func
},
makeId: function(prefix) {
return (prefix || "") + this.props.summary.toLowerCase().replace(/\s/g, "-");
},
render: function() {
var cx = React.addons.classSet;
return (
<div className="example">
<h3 id={this.makeId()}>
{this.props.summary}
<a href={this.makeId("#")}>&nbsp;</a>
</h3>
<div className={cx({comp: true, dashed: this.props.dashed})}
style={this.props.style}>
<Frame width={this.props.width} height={this.props.height}
onContentsRendered={this.props.onContentsRendered}>
{this.props.children}
</Frame>
</div>
</div>
);
}
});
var Example = React.createClass({
makeId: function(prefix) {
return (prefix || "") + this.props.summary.toLowerCase().replace(/\s/g, "-");
@ -272,6 +438,7 @@
});
var App = React.createClass({
render: function() {
return (
<ShowCase>
@ -364,19 +531,19 @@
<Section name="ConversationToolbar">
<h2>Desktop Conversation Window</h2>
<div className="fx-embedded override-position">
<Example summary="Default" dashed="true" style={{width: "300px", height: "272px"}}>
<Example summary="Default" style={{width: "300px", height: "26px"}}>
<ConversationToolbar video={{enabled: true}}
audio={{enabled: true}}
hangup={noop}
publishStream={noop} />
</Example>
<Example summary="Video muted" style={{width: "300px", height: "272px"}}>
<Example summary="Video muted" style={{width: "300px", height: "26px"}}>
<ConversationToolbar video={{enabled: false}}
audio={{enabled: true}}
hangup={noop}
publishStream={noop} />
</Example>
<Example summary="Audio muted" style={{width: "300px", height: "272px"}}>
<Example summary="Audio muted" style={{width: "300px", height: "26px"}}>
<ConversationToolbar video={{enabled: true}}
audio={{enabled: false}}
hangup={noop}
@ -407,30 +574,6 @@
</div>
</Section>
<Section name="GumPromptConversationView">
<Example summary="Gum Prompt conversation view" dashed="true">
<div className="standalone">
<GumPromptConversationView />
</div>
</Example>
</Section>
<Section name="WaitingConversationView">
<Example summary="Waiting conversation view (connecting)" dashed="true">
<div className="standalone">
<WaitingConversationView websocket={mockWebSocket}
dispatcher={dispatcher} />
</div>
</Example>
<Example summary="Waiting conversation view (ringing)" dashed="true">
<div className="standalone">
<WaitingConversationView websocket={mockWebSocket}
dispatcher={dispatcher}
callState="ringing"/>
</div>
</Example>
</Section>
<Section name="PendingConversationView (Desktop)">
<Example summary="Connecting" dashed="true"
style={{width: "300px", height: "272px"}}>
@ -469,94 +612,61 @@
</Example>
</Section>
<Section name="StartConversationView">
<Example summary="Start conversation view" dashed="true">
<div className="standalone">
<StartConversationView conversation={mockConversationModel}
client={mockClient}
notifications={notifications} />
</div>
</Example>
</Section>
<Section name="FailedConversationView">
<Example summary="Failed conversation view" dashed="true">
<div className="standalone">
<FailedConversationView conversation={mockConversationModel}
client={mockClient}
notifications={notifications} />
</div>
</Example>
</Section>
<Section name="ConversationView">
<Example summary="Desktop conversation window" dashed="true"
style={{width: "300px", height: "272px"}}>
<Section name="OngoingConversationView">
<FramedExample width={298} height={254}
summary="Desktop ongoing conversation window">
<div className="fx-embedded">
<ConversationView sdk={mockSDK}
model={mockConversationModel}
video={{enabled: true}}
audio={{enabled: true}} />
<OngoingConversationView
dispatcher={dispatcher}
video={{enabled: true}}
audio={{enabled: true}}
localPosterUrl="sample-img/video-screen-local.png"
remotePosterUrl="sample-img/video-screen-remote.png"
remoteVideoEnabled={true}
mediaConnected={true} />
</div>
</Example>
</FramedExample>
<Example summary="Desktop conversation window large" dashed="true">
<div className="breakpoint" data-breakpoint-width="800px"
data-breakpoint-height="600px">
<FramedExample width={800} height={600}
summary="Desktop ongoing conversation window large">
<div className="fx-embedded">
<ConversationView sdk={mockSDK}
<OngoingConversationView
dispatcher={dispatcher}
video={{enabled: true}}
audio={{enabled: true}}
model={mockConversationModel} />
localPosterUrl="sample-img/video-screen-local.png"
remotePosterUrl="sample-img/video-screen-remote.png"
remoteVideoEnabled={true}
mediaConnected={true} />
</div>
</div>
</Example>
</FramedExample>
<Example summary="Desktop conversation window local audio stream"
dashed="true" style={{width: "300px", height: "272px"}}>
<FramedExample width={298} height={254}
summary="Desktop ongoing conversation window - local face mute">
<div className="fx-embedded">
<ConversationView sdk={mockSDK}
video={{enabled: false}}
audio={{enabled: true}}
model={mockConversationModel} />
<OngoingConversationView
dispatcher={dispatcher}
video={{enabled: false}}
audio={{enabled: true}}
remoteVideoEnabled={true}
remotePosterUrl="sample-img/video-screen-remote.png"
mediaConnected={true} />
</div>
</Example>
</FramedExample>
<Example summary="Standalone version">
<div className="standalone">
<ConversationView sdk={mockSDK}
video={{enabled: true}}
audio={{enabled: true}}
model={mockConversationModel} />
<FramedExample width={298} height={254}
summary="Desktop ongoing conversation window - remote face mute">
<div className="fx-embedded">
<OngoingConversationView
dispatcher={dispatcher}
video={{enabled: true}}
audio={{enabled: true}}
remoteVideoEnabled={false}
localPosterUrl="sample-img/video-screen-local.png"
mediaConnected={true} />
</div>
</Example>
</Section>
</FramedExample>
<Section name="ConversationView-640">
<Example summary="640px breakpoint for conversation view">
<div className="breakpoint"
style={{"text-align": "center"}}
data-breakpoint-width="400px"
data-breakpoint-height="780px">
<div className="standalone">
<ConversationView sdk={mockSDK}
video={{enabled: true}}
audio={{enabled: true}}
model={mockConversationModel} />
</div>
</div>
</Example>
</Section>
<Section name="ConversationView-LocalAudio">
<Example summary="Local stream is audio only">
<div className="standalone">
<ConversationView sdk={mockSDK}
video={{enabled: false}}
audio={{enabled: true}}
model={mockConversationModel} />
</div>
</Example>
</Section>
<Section name="FeedbackView">
@ -575,28 +685,6 @@
</Example>
</Section>
<Section name="CallUrlExpiredView">
<Example summary="Firefox User">
<CallUrlExpiredView isFirefox={true} />
</Example>
<Example summary="Non-Firefox User">
<CallUrlExpiredView isFirefox={false} />
</Example>
</Section>
<Section name="EndedConversationView">
<Example summary="Displays the feedback form">
<div className="standalone">
<EndedConversationView sdk={mockSDK}
video={{enabled: true}}
audio={{enabled: true}}
conversation={mockConversationModel}
feedbackStore={feedbackStore}
onAfterFeedbackReceived={noop} />
</div>
</Example>
</Section>
<Section name="AlertMessages">
<Example summary="Various alerts">
<div className="alert alert-warning">
@ -615,15 +703,6 @@
</Example>
</Section>
<Section name="HomeView">
<Example summary="Standalone Home View">
<div className="standalone">
<HomeView />
</div>
</Example>
</Section>
<Section name="UnsupportedBrowserView">
<Example summary="Standalone Unsupported Browser">
<div className="standalone">
@ -641,100 +720,174 @@
</Section>
<Section name="DesktopRoomConversationView">
<Example summary="Desktop room conversation (invitation)" dashed="true"
style={{width: "260px", height: "265px"}}>
<FramedExample width={298} height={254}
summary="Desktop room conversation (invitation)">
<div className="fx-embedded">
<DesktopRoomConversationView
roomStore={roomStore}
dispatcher={dispatcher}
mozLoop={navigator.mozLoop}
localPosterUrl="sample-img/video-screen-local.png"
roomState={ROOM_STATES.INIT} />
</div>
</Example>
</FramedExample>
<Example summary="Desktop room conversation" dashed="true"
style={{width: "260px", height: "265px"}}>
<FramedExample width={298} height={254}
summary="Desktop room conversation">
<div className="fx-embedded">
<DesktopRoomConversationView
roomStore={roomStore}
dispatcher={dispatcher}
mozLoop={navigator.mozLoop}
localPosterUrl="sample-img/video-screen-local.png"
remotePosterUrl="sample-img/video-screen-remote.png"
roomState={ROOM_STATES.HAS_PARTICIPANTS} />
</div>
</Example>
</FramedExample>
<FramedExample width={298} height={254}
summary="Desktop room conversation local face-mute">
<div className="fx-embedded">
<DesktopRoomConversationView
roomStore={desktopLocalFaceMuteRoomStore}
dispatcher={dispatcher}
mozLoop={navigator.mozLoop}
remotePosterUrl="sample-img/video-screen-remote.png" />
</div>
</FramedExample>
<FramedExample width={298} height={254}
summary="Desktop room conversation remote face-mute">
<div className="fx-embedded">
<DesktopRoomConversationView
roomStore={desktopRemoteFaceMuteRoomStore}
dispatcher={dispatcher}
mozLoop={navigator.mozLoop}
localPosterUrl="sample-img/video-screen-local.png" />
</div>
</FramedExample>
</Section>
<Section name="StandaloneRoomView">
<Example summary="Standalone room conversation (ready)">
<FramedExample width={644} height={483}
summary="Standalone room conversation (ready)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={activeRoomStore}
activeRoomStore={readyRoomStore}
roomState={ROOM_STATES.READY}
isFirefox={true} />
</div>
</Example>
</FramedExample>
<Example summary="Standalone room conversation (joined)">
<FramedExample width={644} height={483}
summary="Standalone room conversation (joined)"
onContentsRendered={joinedRoomStore.forcedUpdate}>
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={activeRoomStore}
roomState={ROOM_STATES.JOINED}
activeRoomStore={joinedRoomStore}
localPosterUrl="sample-img/video-screen-local.png"
isFirefox={true} />
</div>
</Example>
</FramedExample>
<Example summary="Standalone room conversation (has-participants)">
<FramedExample width={644} height={483}
onContentsRendered={updatingActiveRoomStore.forcedUpdate}
summary="Standalone room conversation (has-participants, 644x483)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={updatingActiveRoomStore}
roomState={ROOM_STATES.HAS_PARTICIPANTS}
isFirefox={true}
localPosterUrl="sample-img/video-screen-local.png"
remotePosterUrl="sample-img/video-screen-remote.png" />
</div>
</FramedExample>
<FramedExample width={644} height={483}
onContentsRendered={localFaceMuteRoomStore.forcedUpdate}
summary="Standalone room conversation (local face mute, has-participants, 644x483)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={activeRoomStore}
roomState={ROOM_STATES.HAS_PARTICIPANTS}
activeRoomStore={localFaceMuteRoomStore}
isFirefox={true}
localPosterUrl="sample-img/video-screen-local.png"
remotePosterUrl="sample-img/video-screen-remote.png" />
</div>
</FramedExample>
<FramedExample width={644} height={483}
onContentsRendered={remoteFaceMuteRoomStore.forcedUpdate}
summary="Standalone room conversation (remote face mute, has-participants, 644x483)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={remoteFaceMuteRoomStore}
isFirefox={true}
localPosterUrl="sample-img/video-screen-local.png"
remotePosterUrl="sample-img/video-screen-remote.png" />
</div>
</FramedExample>
<FramedExample width={800} height={660}
onContentsRendered={updatingSharingRoomStore.forcedUpdate}
summary="Standalone room convo (has-participants, receivingScreenShare, 800x660)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={updatingSharingRoomStore}
roomState={ROOM_STATES.HAS_PARTICIPANTS}
isFirefox={true}
localPosterUrl="sample-img/video-screen-local.png"
remotePosterUrl="sample-img/video-screen-remote.png"
screenSharePosterUrl="sample-img/video-screen-terminal.png"
/>
</div>
</FramedExample>
<FramedExample width={644} height={483}
summary="Standalone room conversation (full - FFx user)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={fullActiveRoomStore}
isFirefox={true} />
</div>
</Example>
</FramedExample>
<Example summary="Standalone room conversation (full - FFx user)">
<FramedExample width={644} height={483}
summary="Standalone room conversation (full - non FFx user)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={activeRoomStore}
roomState={ROOM_STATES.FULL}
isFirefox={true} />
</div>
</Example>
<Example summary="Standalone room conversation (full - non FFx user)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={activeRoomStore}
roomState={ROOM_STATES.FULL}
activeRoomStore={fullActiveRoomStore}
isFirefox={false} />
</div>
</Example>
</FramedExample>
<Example summary="Standalone room conversation (feedback)">
<FramedExample width={644} height={483}
summary="Standalone room conversation (feedback)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={activeRoomStore}
activeRoomStore={endedRoomStore}
feedbackStore={feedbackStore}
roomState={ROOM_STATES.ENDED}
isFirefox={false} />
</div>
</Example>
</FramedExample>
<Example summary="Standalone room conversation (failed)">
<FramedExample width={644} height={483}
summary="Standalone room conversation (failed)">
<div className="standalone">
<StandaloneRoomView
dispatcher={dispatcher}
activeRoomStore={activeRoomStore}
roomState={ROOM_STATES.FAILED}
activeRoomStore={failedRoomStore}
isFirefox={false} />
</div>
</Example>
</FramedExample>
</Section>
<Section name="SVG icons preview" className="svg-icons">
@ -754,45 +907,6 @@
}
});
/**
* Render components that have different styles across
* CSS media rules in their own iframe to mimic the viewport
* */
function _renderComponentsInIframes() {
var parents = document.querySelectorAll(".breakpoint");
[].forEach.call(parents, appendChildInIframe);
/**
* Extracts the component from the DOM and appends it in an iframe
*
* @type {HTMLElement} parent - Parent DOM node of a component & iframe
* */
function appendChildInIframe(parent) {
var styles = document.querySelector("head").children;
var component = parent.children[0];
var iframe = document.createElement("iframe");
var width = parent.dataset.breakpointWidth;
var height = parent.dataset.breakpointHeight;
iframe.style.width = width;
iframe.style.height = height;
parent.appendChild(iframe);
iframe.src = "about:blank";
// Workaround for bug 297685
iframe.onload = function () {
var iframeHead = iframe.contentDocument.querySelector("head");
iframe.contentDocument.documentElement.querySelector("body")
.appendChild(component);
[].forEach.call(styles, function(style) {
iframeHead.appendChild(style.cloneNode(true));
});
};
}
}
window.addEventListener("DOMContentLoaded", function() {
try {
React.renderComponent(<App />, document.getElementById("main"));
@ -805,23 +919,28 @@
uncaughtError = err;
}
_renderComponentsInIframes();
// Wait until all the FramedExamples have been fully loaded.
setTimeout(function waitForQueuedFrames() {
if (window.queuedFrames.length != 0) {
setTimeout(waitForQueuedFrames, 500);
return;
}
// Put the title back, in case views changed it.
document.title = "Loop UI Components Showcase";
// Put the title back, in case views changed it.
document.title = "Loop UI Components Showcase";
// This simulates the mocha layout for errors which means we can run
// this alongside our other unit tests but use the same harness.
if (uncaughtError) {
$("#results").append("<div class='failures'><em>1</em></div>");
$("#results").append("<li class='test fail'>" +
"<h2>Errors rendering UI-Showcase</h2>" +
"<pre class='error'>" + uncaughtError + "\n" + uncaughtError.stack + "</pre>" +
"</li>");
} else {
$("#results").append("<div class='failures'><em>0</em></div>");
}
$("#results").append("<p id='complete'>Complete.</p>");
// This simulates the mocha layout for errors which means we can run
// this alongside our other unit tests but use the same harness.
if (uncaughtError) {
$("#results").append("<div class='failures'><em>1</em></div>");
$("#results").append("<li class='test fail'>" +
"<h2>Errors rendering UI-Showcase</h2>" +
"<pre class='error'>" + uncaughtError + "\n" + uncaughtError.stack + "</pre>" +
"</li>");
} else {
$("#results").append("<div class='failures'><em>0</em></div>");
}
$("#results").append("<p id='complete'>Complete.</p>");
}, 1000);
});
})();