
The Quality of Recent Pacific Sociological Association (PSA) Meetings: Location, Session Quality and Institutional Change

Enrico A. Marcelli & Charles F. Hohm & Jane Kil & Genesis Reyes

# Springer Science+Business Media New York 2014

Abstract Employing data (n = 734) collected from those having attended the Pacific Sociological Association (PSA) annual meetings held in San Diego, California (2012) and Reno, Nevada (2013), we test whether session quality and host-city satisfaction are positively associated with how respondents rated the overall quality of the meetings. A majority (54 %) of respondents rated the quality of the meetings highly (“Above Average” or “Excellent”), and, suggestive of the importance of conference location, this share declined from 64 % in 2012 to 41 % in 2013. Controlling for individual characteristics and institutional affiliation, regression results intimate that both host-city satisfaction and session quality are positively associated with how respondents rated the overall quality of the meetings, and they suggest that the former is somewhat more important than the latter for explaining variation in meeting quality. A final stage of our analysis (n = 205) finds that attendees’ evaluations of changes in how the 2013 PSA meetings were developed and organized are positively associated with meeting quality without diminishing the independent effects of session or location quality. We discuss implications of these results for future PSA meetings, as well as for research investigating how to improve the quality of regional academic conferences more generally.

Keywords Conference satisfaction . Location . Presentation quality . Organizational change . Assessment

Am Soc, DOI 10.1007/s12108-014-9222-0

E. A. Marcelli (*) : C. F. Hohm : J. Kil
Department of Sociology, San Diego State University, 5500 Campanile Drive, San Diego, CA 92182, USA
e-mail: [email protected]

C. F. Hohm
e-mail: [email protected]

J. Kil
e-mail: [email protected]

G. Reyes
San Diego State University, 5500 Campanile Drive, San Diego, CA 92182, USA
e-mail: [email protected]


Introduction

Periodic assessment of students and academic departments in Sociology is fairly common (American Sociological Association 2006; American Sociological Association Task Force on Assessing the Undergraduate Major 2005; American Sociological Association Task Force on Sociology and General Education 2007; American Sociological Association Task Force on the Master’s Degree in Sociology 2009; American Sociological Association Task Force on the Undergraduate Major 2004; Cross and Steadman 1996; Godfrey 1998; Hohm and Johnson 2001; Silver and Shulman 2008; Spalter-Roth and Erskine 2003; Van Valey 2011). For instance, the American Sociological Association (ASA) has provided handbooks and other materials to assist sociology departments in such endeavors for many years (Sally Hillsman, personal communication, July 2, 2013), and the Eastern Sociological Society (ESS) has conducted at least one member satisfaction survey (Emily Mahon, personal communication, May 13, 2013). However, only recently have various sociologists involved with the Pacific Sociological Association (PSA) annual meetings followed the ESS’s example by moving assessment beyond academic departments and asking whether PSA leadership and its members might benefit from their own evaluation (Downey and Orr 2014).

After searching the literatures of other academic disciplines, we became aware of how at least one, the European Regional Science Association (ERSA), has assessed participation in its conferences (van Dijk and Maier 2006). Regional science is a field of the social sciences where the focus is on analytical approaches to problems that are specifically rural, urban or regional. The authors (van Dijk and Maier 2006) analyzed the participation of ERSA members at their annual conferences from 1998 to 2003, and three hypotheses were tested: (1) Participants in all ERSA conferences will be from all major European countries, and the ranking of countries by number of participants is expected to be quite similar between conferences; (2) There will be a stable set of frequent participants (3+ conferences) as well as a significant number of one-time participants; and (3) Due to time and money constraints, distance will be an impediment to conference participation. The three hypotheses were tested using information about the annual ERSA conferences published in electronic form on CD-ROMs. All three hypotheses were confirmed. The authors also looked at the relationship between the frequency of attendance and distance travelled and found that the average distance travelled by members who participated three times or fewer is significantly higher than by those members who attended more than three times. However, when the authors excluded participants from outside Europe, the average distance travelled was the same (1,300 km), irrespective of the frequency of attendance (van Dijk and Maier 2006: 500). Finally, van Dijk and Maier (2006: 498) found that the popularity of various European cities was significantly related to conference attendance. The cities of Vienna (Austria), Jyvaskyla (Finland) and Dortland (Germany) were viewed more positively than Dublin (Ireland), Barcelona (Spain) and Zagreb (Croatia).

The organizational aspects and formats of the conferences were quite similar, with the exception of the 2003 conference in Jyvaskyla, where “R-sessions” were introduced. These sessions had (1) papers that were selected by a peer-refereeing process and (2) assigned discussants to improve the quality of the papers and to give better papers more exposure. However, though R-sessions were added to the 2003 conference, as in other ERSA conferences nearly all proposed papers were accepted for presentation. Though


the authors doubt that the introduction of R-sessions had a significant independent impact on the number of participants, the 2003 conference at Jyvaskyla had a greater number of participants than the previous five conferences.

The purpose of our article, which is informed by van Dijk and Maier (2006), is straightforward. We employ 2012–2013 PSA “Member Satisfaction Survey” data to test whether attendees’ evaluations of overall meeting quality are positively associated with (1) reported quality of sessions at the PSA annual meetings, and (2) the location where the meetings were held in 2012 (San Diego, CA) and 2013 (Reno, NV). The first hypothesis suggests that the satisfaction scholars and students experience while attending academic conferences is a direct function of presenting their research and learning about that of others. The second intimates that conference attendees are drawn to cities that feature attractive climates, artistic and technical innovations, and certain kinds of activities or amenities.

Approximately 20 % more of those who attended the meetings in San Diego, California (64 %), compared to those who attended those in Reno, Nevada (41 %), rated the overall quality of the meetings “Above Average” or “Excellent.” And as Fig. 1 below illustrates, controlling for age, gender, ethno-racial group and educational attainment, both host-city satisfaction and presentation quality appear to be positively associated with overall quality of the meetings. Specifically, the probability of having rated the overall quality of the meetings highly (“Above Average” or “Excellent”) rises tenfold, from about 8 % (for those who rated the quality of session presentations as “Not So Good”) to 80 % (for those who rated the quality of session presentations as “Excellent”). The percentage change in the probability of rating meeting quality highly is somewhat lower for the same responses to the host-city

[Figure 1 here: bar chart. X-axis: satisfaction with host city or quality of session presentations, from “Not So Good” to “Excellent”; y-axis: probability (0–90 %) of having rated meeting quality “Above Average” or “Excellent”; two series: Quality of Session Presentations and Satisfaction with Host City.]

Fig. 1 Predicted probability of having rated overall PSA meeting quality “Above Average” or “Excellent”


satisfaction question, from about 18 % for those who reported that the city was “Not So Good” to approximately 76 % for those who rated the city as “Excellent.” In other words, at first glance the overall quality of the meetings seems to be somewhat more sensitive to program quality than to attendees’ evaluation of conference location.

Although results reported below also suggest that those who have not earned a college degree, and those age 65 years and older, were more likely to rate meeting quality higher than those who were more highly educated and younger, host-city satisfaction and session presentation quality remain the most important factors for understanding variation in meeting quality. And while we estimate that attendees’ evaluations of changes in the PSA annual meeting program development and organization process during the 2013 cycle are positively and statistically associated with meeting quality, inclusion of these evaluations does not diminish the relationship that both session quality and location quality have with meeting quality.

After discussing the PSA Member Satisfaction Survey data we employ and our analytical model, we conclude by considering what our results mean for how the PSA might begin to improve the quality of the annual meetings, as well as several promising directions for future research.

Data

The PSA’s Annual Meeting Satisfaction Survey was first implemented following the 2011 annual meetings in Seattle, Washington. However, because the question regarding how annual meeting attendees rated the overall quality of the meetings was not asked until after the 2012 meetings in San Diego, California, we employ only these and the most recent (2013) data, which were collected after the meetings held in Reno, Nevada. Descriptive results for all three surveys have been published in The Pacific Sociologist (Hohm 2011, 2012, 2013) and may be accessed on the PSA website (http://pacificsoc.org/member-satisfaction-survey.html). All data were collected using Survey Monkey, a software program that permits one to construct and implement an online questionnaire. Survey invitations and reminders were sent out using Survey Monkey’s email collector function. Each attendee was emailed a unique link in the survey invitation email to prevent duplicate responses, and only one answer was allowed per question. Respondents were able to skip a question if they chose not to answer.

The 2011 PSA annual meetings were held in Seattle, Washington from March 10th through 13th, and on April 14th attendees were emailed a request to take the 2011 PSA Annual Meeting Satisfaction Survey. Five subsequent email messages were sent during the next 15 days, and the response rate was 52 % (538 of 1,028 completed the questionnaire). Employing a similar procedure after the 2012 meetings in San Diego, California (four email messages were sent during the 15 days following the first on April 17th, 2012), the response rate was slightly lower (45 %, or 485 of 1,077 attendees). Following the March 21st through 24th, 2013 meetings in Reno, Nevada, attendees were emailed the 2013 PSA Annual Meeting Satisfaction Survey on April 9th, 2013, and once again a similar follow-up procedure was employed (seven subsequent messages were sent during the 15 days following commencement of survey implementation). Despite this additional follow-up effort, the response rate


was only 39 % (381 of 967 attendees). Why this response rate has declined by 25 % during the past 3 years is worthy of future investigation, but beyond the scope of this paper.

The data collected in the 2011–2013 PSA Member Satisfaction Surveys may be usefully separated into five analytical categories (Table 1): (1) Meeting quality/satisfaction, (2) Participation, (3) Registration, (4) Demographic characteristics/Institutional affiliation, and (5) 2013 Program Changes. The number and percentage of questions by analytical category, city and year are shown in the matrix below.

The majority of questions (56 % for all 3 years) ask attendees to report on the quality of, or satisfaction with, meeting receptions and sessions. Almost one-fifth (19 %) of the questions, again across all three years, inquire about participation in the meetings, and about the same percentage ask about registration, demographic characteristics or institutional affiliation. Our outcome variable of interest in this article is generated from a question that asked “Overall, how would you rate the quality of the San Diego (or Reno/Sparks) meeting?” and offered five possible responses: (1) “Not So Good,” (2) “Below Average,” (3) “Average,” (4) “Above Average,” and (5) “Excellent.” We decided, however, to create a binary (“dummy”) version of this variable for two basic reasons. First, it is convenient to be able to discuss how a change in an explanatory variable such as session quality, location quality or the favorability of a PSA programmatic change in 2013 is associated with having rated overall meeting quality highly using probabilistic terminology. Regressing a dummy variable (e.g., using STATA’s logit function) on possible explanatory factors (Hosmer and Lemeshow 1989) and employing a common coefficient conversion formula (Studenmund 2001) permit this. And second, we obtain very similar regression results when estimating a more complicated ordinary least squares (OLS) or multinomial logistic regression model using the original five-category response format.
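The binary recoding described above can be sketched as follows. This is a minimal illustration of the idea, not the authors’ actual STATA code; only the five response labels are taken from the survey.

```python
# Minimal sketch of the binary ("dummy") recoding of the five-category
# meeting-quality response described in the text.
RESPONSES = ("Not So Good", "Below Average", "Average", "Above Average", "Excellent")

def meeting_quality_dummy(response: str) -> int:
    """Return 1 if the rating is "Above Average" or "Excellent", else 0."""
    if response not in RESPONSES:
        raise ValueError(f"unknown response: {response!r}")
    return int(response in ("Above Average", "Excellent"))

ratings = ["Excellent", "Average", "Above Average", "Not So Good"]
print([meeting_quality_dummy(r) for r in ratings])  # [1, 0, 1, 0]
```

The same recoding applies to any of the five-point rating items when a dichotomous outcome is wanted for logistic regression.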

As can be seen in Table 2 below, we have provided definitions for our dummy and all explanatory variables, as well as their means, standard deviations, and lower- and upper-bound values. The initial combined 2012–2013 sample size was 893 (2012: n = 499; 2013: n = 394), but after dropping 50 observations for those who did not answer the meeting quality question, the sample is reduced to 844 (2012: n = 478; 2013: n = 366). Slightly more than 100 additional observations lacked a value for at least one of the explanatory variables listed in Table 2, leaving a final sample of 734 (2012: n = 410; 2013: n = 324).
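The two-step sample restriction just described (first dropping respondents missing the outcome, then those missing any explanatory variable) amounts to listwise deletion. A minimal sketch with hypothetical field names, not the survey’s actual variable names:

```python
# Hypothetical sketch of the listwise deletion described above; the field
# names ("quality", "age", "male") are illustrative only.
def listwise_delete(records, outcome, predictors):
    """Keep only records with the outcome and every predictor present."""
    with_outcome = [r for r in records if r.get(outcome) is not None]
    return [r for r in with_outcome
            if all(r.get(p) is not None for p in predictors)]

sample = [
    {"quality": 1, "age": 30, "male": 0},
    {"quality": None, "age": 41, "male": 1},  # dropped: missing outcome
    {"quality": 0, "age": None, "male": 0},   # dropped: missing predictor
]
kept = listwise_delete(sample, "quality", ["age", "male"])
print(len(kept))  # 1
```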

Table 1 Number of PSA member satisfaction survey questions by type and year

                                Seattle, WA    San Diego, CA    Reno, NV
                                (2011)         (2012)           (2013)
                                N     %        N     %          N     %
1. Quality/Satisfaction         18    56       28    57         29    54
2. Participation                 5    16       10    20         11    20
3. Registration                  4    13        6    12          6    11
4. Demography/Affiliation        5    16        5    10          5     9
5. 2013 Program Changes          0     0        0     0          3     6
Total                           32   100       49   100         54   100


A Model of PSA Meeting Quality

Our logistic regression model of meeting quality can be written as

Y_it = α + β_i (Session Quality) + γ_i (Location Quality)
     + δ_i (Individual/Institutional Characteristics) + ε_it,

where Y_it is equal to one if overall meeting quality was reported as “Above Average” or “Excellent” by individual attendee “i” in year “t”, and zero if reported as “Not So Good,” “Below Average” or “Average”; “β_i” is a vector of parameter coefficients representing the associations between individual “Session Quality” variables (i.e., presentation quality, rigor of research presented, question and answer quality) and meeting quality; “γ_i” is a vector of parameter coefficients reflecting the associations between individual “Location Quality” variables (i.e., host-city satisfaction, satisfaction with location of hotel) and meeting quality; and “δ_i” is a vector of parameter coefficients for associations between “Individual/Institutional Characteristics” variables (i.e., age, gender, ethno-racial group, educational attainment, institutional affiliation) and meeting quality. Variation in meeting quality not accounted for by our three session quality, two meeting place, and five individual/institutional characteristics variables (as well as a year dummy variable included to control for unobserved changes from 2012 to 2013) is captured in the error term (ε_it). It should also be noted that, although not explicitly indicated in the model above, in a final stage of our analysis we investigate how attendees’ evaluations of 2013 PSA program and organization process changes are associated with meeting quality, and whether inclusion of these alters the observed relationship between session or location quality and meeting quality.

The variables that may be used for this purpose are also defined in Table 2 and include attendee familiarity with the 2013 changes in program development and organization, how favorably they rated these changes overall, and how favorable they were toward broad program topic areas, the distinction made between formal research presentations and those featuring research “in-progress,” the creation of additional plenary sessions, and planning for a lunch break when no sessions were scheduled.
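The logistic model above maps the linear index onto a predicted probability through the inverse-logit transformation. A generic sketch (the coefficient and covariate values passed in below are made up for illustration, not the paper’s estimates):

```python
import math

def p_high_quality(alpha, coefs, values):
    """Logistic model: P(Y_it = 1) = 1 / (1 + exp(-(alpha + sum_k coef_k * x_k))).
    coefs and values are parallel sequences covering all right-hand-side
    variables (session quality, location quality, individual/institutional
    characteristics, year dummy)."""
    z = alpha + sum(b * x for b, x in zip(coefs, values))
    return 1.0 / (1.0 + math.exp(-z))

# With a zero linear index, the predicted probability is exactly 0.5.
print(p_high_quality(0.0, [0.5, -0.2], [0.0, 0.0]))  # 0.5
```

The transformation guarantees predictions between 0 and 1, which is why the authors favor the logit over a linear probability model for probabilistic interpretation.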

Analysis and Results

Slightly more than half (about 54 %) of those having attended the PSA meetings and having responded to the surveys ranked them highly, that is, “Above Average” or “Excellent” (Table 2). Concealed within this descriptive statistic, furthermore, is a decline in meeting quality from 2012 to 2013: whereas a majority (approximately 64 %) of respondents ranked the overall quality of the 2012 meetings in San Diego highly, only a minority (41 %) did so regarding the 2013 meetings in Reno, NV. While it is also the case that all of the Session and Location Quality variables have means approximating “Average” (Satisfaction with Hotel Location) or “Above Average” (the remaining four variables in these two analytical categories), it is interesting to note that the means of the three in the Session Quality category rose somewhat from 2012 to 2013 and that those of the two in the Meeting Place category fell. For instance, mean presentation quality rose slightly from 3.6 to 3.9 and mean host-city satisfaction


Table 2 Descriptive statistics, 2012 & 2013 PSA meeting satisfaction survey data

Note: all five-point rating variables below are scored (1) “Not So Good,” (2) “Below Average,” (3) “Average,” (4) “Above Average,” and (5) “Excellent.” Entries report the variable definition, followed by the mean (μ), standard deviation (σ, where applicable), and [min, max].

Quality of Annual PSA Meeting: Dummy = 1 if respondent rated annual meeting as “Above Average” or “Excellent”; μ = 0.538 [0, 1]

Session quality
  Presentation Quality (+): respondent rating of presentation quality; μ = 3.734, σ = 0.807 [1, 5]
  Rigor of Research (+): respondent rating of research rigor; μ = 3.578, σ = 0.823 [1, 5]
  Q&A Quality (+): respondent rating of question and answer period; μ = 3.764, σ = 0.852 [1, 5]

Location quality
  Satisfaction with Host City (+): respondent host-city satisfaction; μ = 3.537, σ = 1.431 [1, 5]
  Satisfaction with Hotel Location (+): respondent satisfaction with hotel location; μ = 2.753, σ = 1.445 [1, 5]

Individual/Institutional characteristics
  Age
    Less than 25 years (+/−): respondent age reported as less than 25 years; μ = 0.178 [0, 1]
    25 to 35 years (Control Group): respondent age reported as 25 to 35 years; μ = 0.332 [0, 1]
    36 to 45 years (+/−): respondent age reported as 36 to 45 years; μ = 0.252 [0, 1]
    46 to 55 years (+/−): respondent age reported as 46 to 55 years; μ = 0.104 [0, 1]
    56 to 65 years (+/−): respondent age reported as 56 to 65 years; μ = 0.094 [0, 1]
    Greater than 65 years (+/−): respondent age reported as greater than 65 years; μ = 0.040 [0, 1]
  Male (+/−): Dummy = 1 if respondent reported gender as “male”; μ = 0.335 [0, 1]
  Ethno-racial group
    Asian (+/−): Dummy = 1 if respondent’s reported ethno-racial group is “Asian/Pacific Islander”; μ = 0.074 [0, 1]
    Black (+/−): Dummy = 1 if respondent’s reported ethno-racial group is “African American”; μ = 0.034 [0, 1]
    Latino (+/−): Dummy = 1 if respondent’s reported ethno-racial group is “Hispanic”; μ = 0.093 [0, 1]
    White (Control Group): Dummy = 1 if respondent’s reported ethno-racial group is “Caucasian”; μ = 0.673 [0, 1]
    Other (+/−): Dummy = 1 if respondent’s reported ethno-racial group is “Native American/American Indian/Alaskan Native,” “Multiracial,” or “Other (please specify)”; μ = 0.127 [0, 1]


declined from 4.4 (“Above Average”) to 2.4 (“Below Average”). These changes are reported in Fig. 2 below, and suggest that declining 2012–13 PSA meeting quality is more likely a function of declining satisfaction with meeting location than session quality (which actually rose slightly).

Table 2 (continued)

  Education attainment
    High School (+): Dummy = 1 if respondent has a “High School Diploma or Equivalent” but no “Bachelor’s Degree”; μ = 0.142 [0, 1]
    College (+): Dummy = 1 if respondent has a “Bachelor’s Degree”; μ = 0.061 [0, 1]
    Master’s (+): Dummy = 1 if respondent has completed at least his or her “1st year Master’s” but has not earned a “Ph.D. Degree”; μ = 0.343 [0, 1]
    Doctorate (Control Group): Dummy = 1 if respondent has a “Ph.D. Degree”; μ = 0.454 [0, 1]

  Institutional affiliation
    Community College (+): Dummy = 1 if respondent institutional affiliation is “Community College,” “Applied Sociologist,” or “Other (please specify)”; μ = 0.079 [0, 1]
    Four-year College (Control Group): Dummy = 1 if respondent institutional affiliation is “Four-year college”; μ = 0.369 [0, 1]
    Master’s Program (−): Dummy = 1 if respondent institutional affiliation is “Master’s”; μ = 0.223 [0, 1]
    Doctoral Program (−): Dummy = 1 if respondent institutional affiliation is “Doctoral”; μ = 0.328 [0, 1]

  Survey Year = 2013 (+/−): Dummy = 1 if year of survey = 2013; μ = 0.441 [0, 1]

2013 Meeting Program Changes (a)
  Familiarity with 2013 Changes (+/−): subject (1) “unfamiliar,” (2) “somewhat familiar,” or (3) “very familiar” with changes
  Favorability toward changes overall (+): subject rated program development/organization process from (1) “very unfavorable” to (5) “very favorable”
  Broad Topical Areas (+): subject rated broad topical submission areas from (1) “very unfavorable” to (5) “very favorable”
  In-progress vs. Formal Research (+): subject rated distinctions between formal presentations from (1) “very unfavorable” to (5) “very favorable”
  Plenary Sessions (+): subject rated additional plenary sessions from (1) “very unfavorable” to (5) “very favorable”
  Lunch Break (+): subject rated uninterrupted lunch break from (1) “very unfavorable” to (5) “very favorable”

(a) There were 244 respondents who answered questions regarding 2013 PSA program changes. Of these, 205 also had values for all other remaining variables in our model.


Although it is not possible to determine the mean age of survey respondents because of the way in which this question was asked, we can see from Table 2 that about one-third were between the ages of 25 and 35 years, another 25 % were 36 to 45 years old, and about 18 % were younger than 25 years. We also see that two-thirds of respondents were female, a slightly higher percentage (67 %) claimed to be white, and most had earned either a doctorate (45 %) or a Master’s degree (34 %). In terms of institutional affiliation, we see that about 37 % were employed at a four-year college, 33 % at a university granting Ph.D.s, and 22 % at a college or university offering Master’s degrees. Only 8 % worked at a community college or other academic or non-academic institution. Lastly, fewer respondents in the sample (44 %) came from the 2013 survey than from the 2012 survey.

Table 3 reports three sets of parameter coefficients, standard errors and statistical confidence levels generated when “logistically” regressing meeting quality on all explanatory variables in our model (again, excluding those reflecting familiarity and favorability with 2013 program changes) for both years combined, and for 2012 and 2013 separately. Results for combined years are reported in the first two columns of Table 3, results for 2012 only are reported in columns three and four, and results for 2013 are reported in the last two columns. While results vary slightly for 2012 and 2013, the main message revealed here is that both session quality and satisfaction with meeting location are strongly and positively related to PSA members’ evaluations of the annual meetings, and that those who are older and have not earned a college degree rated the meetings more highly. Model test statistics reported at the bottom of Table 3 suggest that our session-/location-quality model correctly classifies whether a respondent rated the annual meetings highly 76–80 % of the time.
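A percent-correctly-classified statistic of the kind reported at the bottom of Table 3 can be computed from predicted probabilities as follows. This is a generic sketch: the 0.5 cutoff and the toy values below are assumptions for illustration, not the paper’s data or STATA’s exact procedure.

```python
# Generic sketch of a percent-correctly-classified statistic; the 0.5
# cutoff and the toy values below are illustrative assumptions.
def percent_correct(probs, actuals, cutoff=0.5):
    """Share of observations whose predicted class (prob >= cutoff)
    matches the observed 0/1 outcome, expressed as a percentage."""
    preds = [1 if p >= cutoff else 0 for p in probs]
    hits = sum(p == a for p, a in zip(preds, actuals))
    return 100.0 * hits / len(actuals)

print(percent_correct([0.9, 0.2, 0.7, 0.4], [1, 0, 0, 0]))  # 75.0
```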

[Figure 2 here: bar chart of mean attendee ratings, on the (1) “Not So Good” to (5) “Excellent” scale (y-axis 0.0–4.5), of overall PSA meeting quality, presentation quality, rigor of research, Q&A quality, satisfaction with host city, and satisfaction with hotel location, shown separately for 2012 and 2013.]

Fig. 2 Attendee evaluations of overall meeting quality and meeting characteristics, 2012 & 2013


To make the interpretation of these results more meaningful and useful (e.g., for comparing the relative import of session and location quality factors for understanding variation in meeting quality), we convert estimated parameter coefficients reported in

Table 3 Results of regressing high meeting quality on quality of sessions and location

                                        2012 & 2013        2012               2013
                                        β       S.E.       β       S.E.       β       S.E.

Session quality
  Presentation Quality (+)              0.950   0.195a     0.865   0.26a      1.311   0.327a
  Rigor of Research (+)                 0.250   0.183c     0.295   0.241      0.073   0.307
  Q&A Quality (+)                       0.440   0.139a     0.608   0.184a     0.206   0.23
Location quality
  Satisfaction with Host City (+)       0.708   0.114a     0.539   0.166a     0.811   0.19a
  Satisfaction with Hotel Location (+)  0.415   0.082a     0.362   0.095a     0.6     0.183a
Individual/Institutional characteristics
  Age
    Less than 25 years (+/−)            −0.326  0.384      −0.83   0.539      0.188   0.571
    36 to 45 years (+/−)                0.188   0.290      −0.08   0.375      0.677   0.509
    46 to 55 years (+/−)                0.270   0.393      −0.25   0.526      1.071   0.647c
    56 to 65 years (+/−)                0.319   0.389      0.054   0.529      0.796   0.65
    Greater than 65 years (+/−)         1.138   0.601b     —       —          0.34    0.809
  Male (+/−)                            0.090   0.211      0.106   0.284      −0.05   0.339
  Ethno-racial group
    Asian (+/−)                         −0.243  0.379      −0.02   0.508      −0.8    0.616
    Black (+/−)                         −0.812  0.536      −0.37   0.855      −1.26   0.734c
    Latino (+/−)                        −0.310  0.342      0.098   0.507      −0.63   0.52
    Other (+/−)                         0.077   0.297      −0.04   0.405      0.295   0.482
  Education attainment
    High School (+)                     1.045   0.483b     0.909   0.67b      1.277   0.754b
    College (+)                         0.482   0.508      0.542   0.729      0.662   0.77
    Master’s (+)                        −0.011  0.291      0.008   0.398      −0      0.488
  Institutional affiliation
    Community College (+)               −0.581  0.423      −0.09   0.574      −1.68   0.765
    Master’s Program (−)                −0.043  0.292      0.098   0.406      −0.15   0.458
    Doctoral Program (−)                −0.098  0.297      −0.16   0.425      −0.1    0.454
  Survey Year = 2013 (+/−)              0.141   0.272      —       —          —       —
Intercept (±)                           −9.713  0.898a     −9.18   1.205a     −10.3   1.414a

Percent Concordant Pairs                77.0 %             76.7 %             79.6 %
Prob > χ²                               0.000              0.000              0.000
Pseudo R²                               0.328              0.286              0.360
N (Sample Size)                         734                395                324

Estimated parameter coefficients (β) are reported above as statistically significant at the 99 (a), 95 (b) or 90 (c) percent confidence level


the first column of Table 3 into predicted probability changes (Fig. 3) using one of three conventional methods often employed by economists (Studenmund 2001: 457–459).

Specifically, each horizontal bar represents the change in the predicted probability of a respondent having rated meeting quality highly given a one-unit increase in the associated explanatory variable. When the explanatory variable is continuous (e.g., presentation quality, satisfaction with host city) we multiply the parameter coefficient by the mean of the dependent variable, one minus this mean, and the standard deviation of the explanatory variable. So, for instance, the positive parameter coefficient reported in Table 3 for “Presentation Quality” (0.950) is interpreted as follows: a one-unit increase (of one standard deviation, or 0.807 in this case) in presentation quality is associated with a 19 % higher probability (computed from 0.950 × 0.538 × (1 − 0.538) × 0.807) of having rated meeting quality as “Above Average” or “Excellent.” When the explanatory variable is dichotomous (e.g., “Greater than 65 years,” “High School”), the parameter coefficient is multiplied by the mean of the dependent variable and one minus this mean, given that we are computing how a one-unit change in this variable (which equals one in such cases) is associated with a change in the probability of having rated meeting quality highly. To take an example of this type, not having earned a college degree is associated with a 26 % increase (computed from 1.045 × 0.538 × (1 − 0.538)) in the probability of having rated meeting quality as “Above Average” or “Excellent” compared to having earned a Ph.D., holding all other variables in our model constant.
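The two worked conversions above can be reproduced directly, using the coefficient values from Table 3 and the dependent-variable mean and standard deviation values from Table 2:

```python
YBAR = 0.538  # sample mean of the meeting-quality dummy (Table 2)

def dprob_continuous(beta, ybar, sd):
    """Predicted-probability change for a one-standard-deviation increase
    in a continuous regressor: beta * ybar * (1 - ybar) * sd."""
    return beta * ybar * (1.0 - ybar) * sd

def dprob_dummy(beta, ybar):
    """Predicted-probability change for a 0/1 regressor switching to 1:
    beta * ybar * (1 - ybar)."""
    return beta * ybar * (1.0 - ybar)

# Presentation Quality: beta = 0.950 (Table 3), sd = 0.807 (Table 2)
print(round(dprob_continuous(0.950, YBAR, 0.807), 2))  # 0.19
# High School: beta = 1.045 (Table 3)
print(round(dprob_dummy(1.045, YBAR), 2))              # 0.26
```

Both results match the 19 % and 26 % figures cited in the text.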

There are two types of bars shown in Fig. 3. Those that are filled represent variables that are estimated to be statistically associated with having rated meeting quality highly. Those that are empty (e.g., “Male,” “Asian,” “College”) are not statistically significant at the 90 % confidence level. Thus, we see that among our Session Quality and Meeting

Fig. 3 Changes in predicted probability of rating meeting quality highly, 2012 & 2013. [Horizontal bar chart; x-axis “Percent Change in Probability of Meeting Quality Reported as ‘Above Average’ or ‘Excellent’” running from −30 % to +30 %; bars grouped as Session Quality (Presentation Quality, Rigor of Research, Q&A Quality), Meeting Place (Satisfaction with Host City, Satisfaction with Hotel Location), and Individual & Institutional Characteristics (age, sex, ethno-racial group, education, institutional affiliation, survey year).]


Place variables, satisfaction with the host city was most strongly associated with our respondents’ overall ratings of the meetings. Not only is it significant at the 99 % confidence level (Table 3), but it also is substantively more related to meeting quality (a 26 % increase). The second most important explanatory variable is presentation quality (a 19 % increase), followed by satisfaction with the location of the hotel (a 15 % increase), “Q&A Quality” (a 9 % increase), and “Rigor of Research” (a 5 % increase).

The final stage of our regression analysis investigates whether attendees’ favorability ratings of changes related to 2013 PSA program development are associated with overall meeting quality, and whether inclusion of these variables alters our initial findings regarding session and location quality. Table 4 (in the two columns labeled “Model 1”) reports all regression model test statistics, and once again we find that session and location quality are positively and significantly associated with meeting quality (before introducing any 2013 program change evaluation variables, as seen in the first two columns). Model 2 shows results after introducing a variable capturing how favorably attendees rated “changes in the program development and organization process.” This variable is estimated to be positively and statistically associated with meeting quality and does not substantively alter the observed relationship between our two main explanatory variables of interest (session and location quality). Model 3’s results also are fairly consistent with results reported in Table 3 when using the full sample (2012 & 2013 data), and it appears that the more favorable respondents were toward the creation of additional plenary sessions (that is, time slots with a single session scheduled), the more likely they were to rate overall meeting quality highly. This is the only specific program change variable that is statistically significant when all four are included simultaneously. It should be noted, however, that when we ran Model 3 with only one of the four program change variables at a time, the distinction between formal and in-progress research and having lunch when other sessions are not scheduled were positively and statistically associated with meeting quality. In other words, while having more plenary sessions is estimated to have been the most important program change in terms of explaining variation in meeting quality, having sessions distinguished by research quality and not scheduling lunch when sessions are being held also seem to have been related to meeting quality.
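For readers who wish to replicate the general approach, the sketch below fits a binary logit by Newton-Raphson on synthetic data and computes two of the fit statistics reported in Tables 3 and 4: a pseudo R² (we assume the tables report McFadden’s statistic or one of its relatives, since the text does not name it) and the percent of concordant pairs. The data, coefficients, and variable names are invented for illustration and do not reproduce the survey estimates.

```python
import math
import random

def sigmoid(z):
    # Numerically stable logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    e = math.exp(z)
    return e / (1.0 + e)

# Synthetic stand-in for the survey: a 1-5 "session quality" score drives
# the odds of rating the meeting highly (coefficients chosen arbitrarily).
random.seed(0)
n = 400
xs = [random.uniform(1, 5) for _ in range(n)]
ys = [1 if random.random() < sigmoid(-4.0 + 1.2 * x) else 0 for x in xs]

def fit_logit(xs, ys, iters=25):
    """Maximum-likelihood logit (intercept + one slope) via Newton-Raphson."""
    b0 = b1 = 0.0
    for _ in range(iters):
        g0 = g1 = h00 = h01 = h11 = 0.0
        for x, y in zip(xs, ys):
            p = sigmoid(b0 + b1 * x)
            w = p * (1 - p)
            g0 += y - p          # gradient of the log-likelihood
            g1 += (y - p) * x
            h00 += w             # entries of the negative Hessian
            h01 += w * x
            h11 += w * x * x
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

def log_lik(b0, b1, xs, ys):
    return sum(y * math.log(sigmoid(b0 + b1 * x)) +
               (1 - y) * math.log(1 - sigmoid(b0 + b1 * x))
               for x, y in zip(xs, ys))

b0, b1 = fit_logit(xs, ys)

# McFadden pseudo R^2: 1 - LL(model) / LL(intercept-only model).
p_bar = sum(ys) / n
ll_null = n * (p_bar * math.log(p_bar) + (1 - p_bar) * math.log(1 - p_bar))
pseudo_r2 = 1 - log_lik(b0, b1, xs, ys) / ll_null

# Percent concordant pairs: among all (rated highly, not rated highly)
# outcome pairs, the share where the model assigns the "rated highly" case
# the larger predicted probability.
probs = [sigmoid(b0 + b1 * x) for x in xs]
ones = [p for p, y in zip(probs, ys) if y == 1]
zeros = [p for p, y in zip(probs, ys) if y == 0]
concordant = sum(p1 > p0 for p1 in ones for p0 in zeros) / (len(ones) * len(zeros))

print(b1 > 0, 0 < pseudo_r2 < 1, concordant > 0.5)
```

Because the intercept-only model is nested within the fitted model, the pseudo R² falls between zero and one, and a strongly predictive regressor pushes both it and the percent of concordant pairs upward, mirroring the pattern across Models 1-3 in Table 4.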

Figure 4 shows the percent change in the probability of a respondent having rated the 2012 or 2013 PSA annual meeting highly (“Above Average” or “Excellent”) as a result of a one standard deviation change in reported session quality, location quality or favorability of a 2013 PSA program change. Specifically, before introducing any evaluations of 2013 PSA program changes, it appears that an increase of slightly less than one (on a scale of one to five) in Presentation Quality is associated with about a 20 % increase in the probability of a respondent having rated meeting quality highly. In Model 2 (of Table 4) we added the general favorability rating for all 2013 PSA program changes, and the darker green bar at the bottom of Fig. 4 shows that an increase of slightly less than one (on a scale of one to five) in this variable is associated with about an 8 % rise in the probability of a respondent having rated meeting quality highly. Importantly, inclusion of this variable does not much alter the relationship between session (presentation) quality and meeting quality, as the second Presentation Quality bar is only slightly longer than the first. When inserting the four specific 2013 PSA program change variables instead of the one general favorability metric, we see that an increase of slightly less than one (on a scale of one to five) in the Plenary Session


Table 4 Results of regressing high meeting quality on 2013 program/organizational changes

Model 1 Model 2 Model 3

β S.E. β S.E. β S.E.

Session quality

Presentation Quality (+) 1.147 0.426a 1.157 0.432a 1.411 0.47a

Rigor of Research (+) 0.326 0.384 0.351 0.389 0.167 0.411

Q&A Quality (+) 0.13 0.308 0.137 0.314 0.136 0.328

Location quality

Satisfaction with Host City (+) 0.795 0.247a 0.81 0.251a 0.88 0.267a

Satisfaction with Hotel Location (+) 0.619 0.223a 0.639 0.23a 0.709 0.248a

Individual/Institutional characteristics

Age

Less than 25 years (+/−) −0.29 1.022 −0.46 1.027 −0.24 1.119

36 to 45 years (+/−) 0.938 0.613c 0.945 0.619c 1.095 0.667b

46 to 55 years (+/−) 1.296 0.74b 1.353 0.751b 1.491 0.805b

56 to 65 years (+/−) 0.722 0.762 0.761 0.765 0.627 0.811

Greater than 65 years (+/−) −0.34 1.034 −0.27 1.045 −0.6 1.066

Male (+/−) −0.08 0.422 −0.04 0.426 −0.02 0.437

Ethno-racial group

Asian (+/−) −0.72 0.701 −0.72 0.702 −0.73 0.729

Black (+/−) −1.01 0.962 −1.19 0.987 −1.55 1.035c

Latino (+/−) −0.16 0.664 −0.14 0.655 −0.31 0.687

Other (+/−) 0.102 0.647 0.302 0.653 0.338 0.696

Education attainment

High School (+) 1.442 1.187 1.604 1.168 1.047 1.274

College (+) −0.72 1.661 −0.49 1.78 −0.84 1.803

Master’s (+) 0.004 0.607 0.07 0.614 0.243 0.661

Institutional affiliation

Community College (+) −1.16 0.983 −1.15 0.995 −1.31 1.047

Master’s Program (−) 0.356 0.554 0.479 0.563 0.298 0.578

Doctoral Program (−) 0.5 0.557 0.449 0.56 0.332 0.592

2013 Meeting Program Changes*

Favorability toward changes overall (+) — — 0.426 0.247b — —

Broad Topical Areas (+) — — — — 0.034 0.241

In-progress vs. Formal Research (+) — — — — −0.08 0.245

Plenary Sessions (+) — — — — 0.757 0.295a

Lunch Break (+) — — — — 0.199 0.273

Intercept (±) −10.8 1.887a −13 2.351a −15.5 2.723a

Percent Concordant Pairs 78.5 % 80.5 % 82.0 %

Prob > χ2 0.000 0.000 0.000

Pseudo R2 0.363 0.375 0.406

N (Sample Size) 205 205 205

Estimated parameter coefficients (β) are reported above as statistically significant at the 99 (a), 95 (b), or 90 (c) percent confidence level


variable is associated with about a 16 % rise in the probability that a respondent rated meeting quality highly. A one standard deviation increase in Presentation Quality is actually associated, in this case, with a somewhat higher (24 %) rise in the probability of having rated meeting quality highly. The relationship between the two Location Quality variables and meeting quality, whether no 2013 PSA program change variable is included or either the general or the four specific variables are, can be interpreted similarly. But the main finding here is that both session (presentation) quality and location quality appear to be positively associated with meeting quality, and the latter is somewhat more important. There is also some evidence that the PSA program changes implemented during the 2013 cycle (e.g., program layout, time available for discussion) but not reported here were also positively related to meeting quality, but these links are somewhat weaker and inclusion of such factors in our analyses does not alter our main findings.

Summary

Our regression results suggest that participant-rated “session quality” and “host-city satisfaction” are positively associated with respondents’ rating of the overall quality of the 2012 San Diego, CA and 2013 Reno, NV Pacific Sociological Association (PSA) meetings—even after controlling for individual characteristics (e.g., age, ethno-racial group, educational attainment) and institutional affiliation (e.g., Ph.D.-granting university, four-year undergraduate college). Satisfaction with where the PSA conferences were held is somewhat more important in explaining overall quality of the meetings

Fig. 4 Changes in predicted probability of rating meeting quality highly, 2013. [Horizontal bar chart; x-axis “Percent Change in Probability of Meeting Quality Reported as ‘Above Average’ or ‘Excellent’” running from −30 % to +30 %; bars grouped as Session Quality (Presentation Quality, Rigor of Research, Q&A Quality), Location Quality (Satisfaction with Host City, Satisfaction with Hotel Location), and 2013 program changes (Broad Topical Areas, In-progress vs. Formal Research, Plenary Sessions, Lunch Break, favorability toward changes overall).]


than is session quality. And changes in how the 2013 PSA meetings were developed and organized were also found to be positively associated with respondents’ rating of the overall quality of the meetings without diminishing the independent effects of session or location quality.

What other issues related to regional sociological association meetings might be usefully studied? An issue that we believe deserves attention is the “no show” problem. “No shows” are colleagues whose papers were accepted and who were listed on the program, but who failed to attend the meetings without informing anyone. But who fails to show up, and why? Although the 2012–2013 PSA Member Satisfaction Survey data employed in this paper do not permit an analysis of “no shows,” Downey and Orr (2014) investigate this problem in this special issue of The American Sociologist using information gathered by the PSA office via two-page assessment forms customized for each session of the 2012–2013 annual meetings. Preliminary analysis shows that fully one-third of the papers on the 2012 PSA program were not presented at the 2012 PSA meeting!

Yet another issue worthy of future research is the inclusion of papers based on research that is not yet completed alongside those based on finished work. Unfortunately, as was the case with “no shows,” the PSA has not had any data to address this concern until recently (Downey and Orr 2014). Finally, an issue that is of grave concern to the PSA, and that has been addressed in a number of the articles in this special issue of The American Sociologist, is attracting more faculty from Ph.D.-granting Sociology Departments. We propose, based on findings reported in this paper, that focusing on improving PSA session and location quality may be an important initial step toward addressing the issues of “no shows” and low turnout among faculty of Ph.D.-granting Sociology Departments.

References

American Sociological Association Task Force on Assessing the Undergraduate Major. (2005). Creating an effective assessment plan for the sociology major. Washington, D.C.

American Sociological Association Task Force on Sociology and General Education. (2007). Sociology and general education. Washington, D.C.

American Sociological Association Task Force on the Master’s Degree in Sociology. (2009). Thinking about the master’s degree in sociology: Academic, applied, professional, and everything in between. Washington, D.C.

American Sociological Association Task Force on the Undergraduate Major. (2004). Liberal learning and the sociology major updated: Meeting the challenge of teaching sociology in the twenty-first century. Washington, D.C.

American Sociological Association, Research and Development Department. (2006). What can I do with a bachelor’s degree in sociology?: A national survey of seniors majoring in sociology. Washington, D.C.

Cross, P. K., & Steadman, M. H. (1996). Classroom research: Implementing the scholarship of teaching. San Francisco: Jossey-Bass.

Downey, D., & Orr, A. (2014). Between Scylla and Charybdis: designing, implementing, and assessing innovations in the annual PSA meetings. The American Sociologist. doi:10.1007/s12108-014-9224-y.

Godfrey, E. P. (Ed.). (1998). Teaching sociology at small institutions. Washington, D.C.: American Sociological Association.

Hohm, C. F. (2011). 2011 PSA annual meeting satisfaction survey. The Pacific Sociologist, 19(3). http://pacificsoc.typepad.com/newsletters/psa%20newsletter_0911.pdf.

Hohm, C. F. (2012). 2012 PSA annual meeting satisfaction survey. The Pacific Sociologist, 20(3). http://pacificsoc.typepad.com/newsletters/psa_newsletter_0912.pdf.

Hohm, C. F. (2013). 2013 PSA annual meeting satisfaction survey. The Pacific Sociologist, 21(3). http://pacificsoc.typepad.com/newsletters/psa%20newsletter_0913.pdf.


Hohm, C. F., & Johnson, W. S. (2001). Assessing student learning in sociology (2nd ed.). Washington, D.C.: American Sociological Association.

Hosmer, D. W., & Lemeshow, S. (1989). Applied logistic regression. New York: Wiley.

Silver, I., & Shulman, D. (Eds.). (2008). Academic street smarts: Informal professionalization of graduate students. Washington, D.C.: American Sociological Association.

Spalter-Roth, R. M., & Erskine, W. B. (2003). How does your department compare?: A peer analysis from the AY 2001–2002 survey of baccalaureate and graduate programs in sociology. Washington, D.C.: American Sociological Association.

Studenmund, A. H. (2001). Using econometrics: A practical guide (4th ed.). Boston: Addison Wesley Longman, Inc.

van Dijk, J., & Maier, G. (2006). ERSA conference participation: does location matter? Papers in Regional Science, 85(4), 483–504.

Van Valey, T. L. (Ed.). (2011). Peer review of teaching: Lessons from and for departments of sociology. Washington, D.C.: American Sociological Association.
