[Year 12 SofDev] SD 2012 post mortem

Mark KELLY kel at mckinnonsc.vic.edu.au
Tue Nov 20 10:39:08 EST 2012


Hi Heath. My responses are inline, below...

On 19 November 2012 22:11, Matheson, Heath A <
Matheson.Heath.A at edumail.vic.gov.au> wrote:

>  Thanks once again for the post mortem, Mark – my first way of viewing
> this year’s SD exam.
>
> I have a few thoughts:
>
> MultiChoice Q4 – the ethical one. What if I said “Miss Programmer, I
> want you to make me a program that will give me all e-mails of my employees
> containing the word snowie or sickie.”? Using the ‘What would the VCAA want
> me to say?’ strategy I’d go with D.
>

I read the "what a program will do" as a functional requirement decision,
not an ethical "is it right to make a program do this?" issue. But I can
see your point.

I thought that, with pay and conditions and a code of ethics in the mix, the
issue could be management (legally) abusing work contracts to overwork
programmers, etc.

Looking at the question more deeply, it does not provide enough information
about what "an issue" could be.  We have both come up with issues that
satisfy options C and D.  So logically they should both be acceptable to
the markers...  Hmmm.

I'm very tempted to write to the chief examiner about this one.


>
> I didn’t like B Q3: I like the prototype answer although I think using the
> word “completeness” rather than “quality” in the question would have been
> better if this is the case. Rapid Application Development pops into my head,
> which I guess is repeated prototyping, but I thought that was long gone. I
> would answer ‘when the client doesn’t really know what the finished product
> should look like and the solution requirements are likely to change along
> the way’ but that’s certainly not something I would teach.
>

Yes. B3 was an oddity. I still can't work out what key knowledge was being
examined, unless it was SD U3O2 Key skill #3... "design prototype solutions
that take into account the needs of users."  ??


>  I like the pseudocode questions.
>
A breath of fresh air, I thought... and about time, too.

>
> I think CS Q1 should have asked first for a functional requirement that a
> phone alone cannot provide, then asked for an additional important
> functional requirement.
>
> In CS Q3b I think 1 could also be "record symptom data" and 2 "collate
> asthma episode data", which includes the phone ID (shown in the DFD).
>
Yes, it could be. The DFD does not exactly state where the phone ID is
fetched, so one might have to assume it is during the collation process
(since it is not sent to the collation process in any other data flow).
I'm now leaning towards your interpretation, Heath.


>
> CS Q7: I think using the term “queue” would be what they are looking for,
> with the justification being FIFO, meaning it would always contain the last
> 24 hours of readings.
>
This has got me thinking again.

Data structures in the study design are:

- U3O2 KK03 - types of data structures, including one-dimensional arrays,
records and files
- U4O1 KK10 - forms and uses of data structures to organise and manipulate
data, including two-dimensional arrays, stacks and queues

So, it could be a file or record.  But a queue...?

The DFD shows that pollutant data are stored in the 'accumulated air
quality data' store with the time and date.
This means the samples would not have to be queued to keep them in
chronological order: they could be sorted and processed later using the
attached timecode.

So a record including the four pollutant readings plus the time and date
would suffice.
The queue would work, but would not be necessary.
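
To make the comparison concrete, here is a minimal sketch (Python, purely
illustrative; the field names and the 24-hour trimming are my assumptions,
not anything from the exam or study design) of the two approaches:
timestamped records that can be sorted later, versus a FIFO queue trimmed
so it only ever holds the last 24 hours of readings.

from collections import deque, namedtuple
from datetime import timedelta

# Hypothetical record: four pollutant readings plus a timestamp, as implied
# by the 'accumulated air quality data' store. Field names are placeholders.
Reading = namedtuple("Reading", "timestamp p1 p2 p3 p4")

# Approach 1: append timestamped records; chronological order can be
# recovered later by sorting on the timestamp, so no queue is needed.
accumulated = []

def store_reading(reading):
    accumulated.append(reading)

def last_24_hours(now):
    cutoff = now - timedelta(hours=24)
    return sorted((r for r in accumulated if r.timestamp >= cutoff),
                  key=lambda r: r.timestamp)

# Approach 2: a FIFO queue, trimmed as each new reading arrives, so it
# always contains only the most recent 24 hours (Heath's justification).
recent = deque()

def enqueue_reading(reading):
    recent.append(reading)                       # newest joins at the rear
    cutoff = reading.timestamp - timedelta(hours=24)
    while recent and recent[0].timestamp < cutoff:
        recent.popleft()                         # oldest drops off the front

Either way the last 24 hours can be produced; the queue just does the
housekeeping as data arrive, while the record-plus-timestamp version defers
it to the collation step.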

So, I think the markers should accept file, record or queue as long as
there was a relevant justification.
Thanks for the food for thought, Heath.  You're thinking along the same
lines as Kevork on these points!


>
> Regarding Acceptance Testing, the term acceptance is only used adjacent to
> user in the study design, which adds a bit of weight to the “asthmatics”
> answer.
>
Yes, I see your point, but "acceptance testing" (AT) and "user acceptance
testing" (UAT) are different things, which is why they have different names.
If the exam meant to refer to UAT it should have said so, even if the study
design's only reference to AT does tar it with the user brush (so to speak).

So, the study design's mention of "acceptance testing" is actually
"acceptance testing by users." But if they meant to refer to UAT in the
exam, they should have said that. But they didn't. They referred to
"acceptance testing".

Quoting the first couple of Google hits for "acceptance testing", I found:

http://www.wisegeek.com/what-is-acceptance-testing.htm

Acceptance testing is a final stage of testing that is performed on a
system prior to the system being delivered to a live environment. Systems
subjected to acceptance testing might include such deliverables as a
software system or a mechanical hardware system. Acceptance tests are
generally performed as "black box" tests. Black box testing means that the
tester uses specified inputs into the system and verifies that the
resulting outputs are correct, without knowledge of the system's internal
workings.

User acceptance testing (UAT) is the term used when the acceptance tests
are performed by the person or persons who will be using the live system
once it is delivered. If the system is being built or developed by an
external supplier, this is sometimes called customer acceptance testing
(CAT). The UAT or CAT acts as a final confirmation that the system is ready
for go-live. A successful acceptance test at this stage may be a
contractual requirement prior to the system being signed off by the client.

http://en.wikipedia.org/wiki/Acceptance_testing

In engineering and its various subdisciplines, acceptance testing is a
test conducted to determine if the requirements of a specification or
contract are met. It may involve chemical tests, physical tests, or
performance tests.

In systems engineering it may involve black-box testing performed on a
system (for example: a piece of software, lots of manufactured mechanical
parts, or batches of chemical products) prior to its delivery.

Software developers often distinguish acceptance testing by the system
provider from acceptance testing by the customer (the user or client) prior
to accepting transfer of ownership. In the case of software, acceptance
testing performed by the customer is known as user acceptance testing
(UAT), end-user testing, site (acceptance) testing, or field (acceptance)
testing.

(This muddies the waters, but still equates UAT with end-user testing.)
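
As a side note, the "black box" idea in those quotes is easy to show in
code. Here is a minimal sketch (Python; the function under test and its
thresholds are invented for illustration, not from the case study): the test
feeds in specified inputs and checks the outputs, with no reference to how
the result is computed internally.

import unittest

# Hypothetical system under test: classify an air-quality reading.
# The thresholds are invented purely for this example.
def classify_air_quality(pollutant_level):
    if pollutant_level < 50:
        return "good"
    if pollutant_level < 100:
        return "moderate"
    return "poor"

class AcceptanceTest(unittest.TestCase):
    # Black-box checks: specified inputs in, expected outputs verified,
    # without any knowledge of the function's internal workings.

    def test_low_reading_is_good(self):
        self.assertEqual(classify_air_quality(10), "good")

    def test_mid_reading_is_moderate(self):
        self.assertEqual(classify_air_quality(75), "moderate")

    def test_high_reading_is_poor(self):
        self.assertEqual(classify_air_quality(150), "poor")

if __name__ == "__main__":
    unittest.main()

If checks like these were run by the asthmatics themselves before sign-off,
that would be the UAT end of the spectrum; run by the developer before
delivery, it is plain acceptance testing.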

So, I reckon that if an exam refers to a common industry term like AT, it
should abide by the common industry interpretation of the term. If it meant
UAT it should have said so.

And I reckon that if an exam tests students' knowledge of terms, the terms
must be defined in the study design's glossary to make it clear which of
several possible interpretations it accepts.  The IT industry often has
variations of terminology.

<gump>And that's all I got to say about that.</gump>


>
> This would be the second time in the case study that you can repeat an
> answer for parts a and b of a question (question 1)
>
Except to add that in 14b it asks "Is the mobile phone software solution
easy to use?", which seems to be fishing for UAT, suggesting 14a is
acceptance testing, in which case there is no repetition.

> I do like this paper, something there for everyone to have a go at.
>
> Please let me know, folks, if I’m completely wrong with these thoughts :-)
>
>
Thanks for the input, Heath. It was really useful!


>  Looking forward to getting back to SD next year; I prefer it to ITA.
>
> Happy year 11 exam marking!
>
> Heath Matheson
>
> Mount Beauty
>
> From: sofdev-bounces at edulists.com.au [mailto:
> sofdev-bounces at edulists.com.au] On Behalf Of Mark KELLY
> Sent: Monday, 19 November 2012 1:25 PM
> To: Year 12 Software Development Teachers' Mailing List
> Subject: [Year 12 SofDev] SD 2012 post mortem
>
> Hi all. The discussion of the SD paper has been very quiet!  Does that
> mean everyone is pleased with it?
>
>
> Overall, I think it was a good paper.  Well balanced, challenging and
> thought-provoking without being unclear.
>
>
> http://www.vceit.com/postmortems/2012sd/SD2012exam.htm
>
> I'm sure there are rough edges here and there, so feedback is welcome, as
> usual.
>
> --
> Mark Kelly - kel at mckinnonsc.vic.edu.au
>
>
-- 
Mark Kelly - kel at mckinnonsc.vic.edu.au
Manager of ICT, Reporting, IT Learning Area
McKinnon Secondary College, McKinnon Rd, McKinnon 3204, Victoria, Australia
Phone: +613 8520 9085, Fax +613 9578 9253
VCE IT Lecture Notes: http://vceit.com
Moderator: IT Applications Edulist <http://edulists.com.au/itapps/index.htm>
Visit Diigo links for ITA <http://groups.diigo.com/group/vce-info-tech> and
SD <http://groups.diigo.com/group/vce-sd>
--
My personal best for the 100 metre sprint is 11.9 metres.

